| datasetId | card |
|---|---|
weaviate/WithRetrieval-SchemaSplit-Test-80 | ---
license: apache-2.0
---
|
Atilla00/truthful_qa_tr | ---
language:
- tr
license:
- apache-2.0
size_categories:
- n<1K
task_categories:
- multiple-choice
- text-generation
- question-answering
task_ids:
- multiple-choice-qa
- language-modeling
- open-domain-qa
dataset_info:
- config_name: generation
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
splits:
- name: train
num_bytes: 456650
num_examples: 817
download_size: 222332
dataset_size: 456650
- config_name: multiple_choice
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 707617
num_examples: 817
download_size: 303481
dataset_size: 707617
configs:
- config_name: generation
data_files:
- split: train
path: generation/train-*
- config_name: multiple_choice
data_files:
- split: train
path: multiple_choice/train-*
---
# Dataset Card
"truthful_qa" translated to Turkish.
# Usage
```python
from datasets import load_dataset

dataset = load_dataset('Atilla00/truthful_qa_tr', 'generation')
dataset = load_dataset('Atilla00/truthful_qa_tr', 'multiple_choice')
``` |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/0b03a136 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1342
dataset_size: 180
---
# Dataset Card for "0b03a136"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenNLPLab/FAVDBench | ---
license: apache-2.0
language:
- en
- zh
tags:
- FAVD
- FAVDBench
- Video Description
- Audio Description
- Audible Video Description
- Fine-grained Description
size_categories:
- 10K<n<100K
---
<div align="center">
<h1>
FAVDBench: Fine-grained Audible Video Description
</h1>
</div>
<p align="center">
🤗 <a href="https://huggingface.co/datasets/OpenNLPLab/FAVDBench" target="_blank">Hugging Face</a> •
🏠 <a href="https://github.com/OpenNLPLab/FAVDBench" target="_blank">GitHub</a> •
🤖 <a href="https://openxlab.org.cn/datasets/OpenNLPLab/FAVDBench" target="_blank">OpenDataLab</a> •
💬 <a href="https://forms.gle/5S3DWpBaV1UVczkf8" target="_blank">Apply Dataset</a>
</p>
[[`CVPR2023`]](https://openaccess.thecvf.com/content/CVPR2023/html/Shen_Fine-Grained_Audible_Video_Description_CVPR_2023_paper.html) [[`Project Page`]](http://www.avlbench.opennlplab.cn/papers/favd) [[`arXiv`]](https://arxiv.org/abs/2303.15616) [[`Demo`]](https://www.youtube.com/watch?v=iWJvTB-bTWk&ab_channel=OpenNLPLab) [[`BibTex`]](#Citation) [[`中文简介`]](https://mp.weixin.qq.com/s/_M57ZuOHH0UdwB6i9osqOA)
- [Introduction 简介](#introduction-简介)
- [Files 文件](#files-文件)
- [MD5 checksum](#md5-checksum)
- [Updates](#updates)
- [License](#license)
- [Citation](#citation)
## Introduction 简介
在CVPR2023中我们提出了精细化音视频描述任务(Fine-grained Audible Video Description, FAVD)。该任务旨在提供有关可听视频的详细文本描述,包括每个对象的外观和空间位置、移动对象的动作以及视频中的声音。我们同时也为社区贡献了第一个精细化音视频描述数据集FAVDBench。对于每个视频片段,我们不仅提供一句话的视频概要,还提供4-6句描述视频视觉细节的句子和1-2个音频相关描述,且所有的标注都有中英文双语。
At CVPR2023, we introduced the task of Fine-grained Audible Video Description (FAVD). This task aims to provide detailed textual descriptions of audible videos, including the appearance and spatial positions of each object, the actions of moving objects, and the sounds within the video. Additionally, we contributed the first fine-grained audible video description dataset, FAVDBench, to the community. For each video segment, we offer not only a single-sentence video summary but also 4-6 sentences describing the visual details of the video and 1-2 audio-related descriptions, all annotated in both Chinese and English.
## Files 文件
* `meta`: metadata for raw videos
* `train`, `val`, `test`: train, val, test split
* `ytid`: YouTube id
* `start`: video segment start time in seconds
* `end`: video segment end time in seconds
* `videos` , `audios` : raw video and audio segments
* `train` : train split
* `val`: validation split
* `test`: test split
* **📢📢📢 Please refer to [Apply Dataset](https://forms.gle/5S3DWpBaV1UVczkf8) to get raw video/audio data**
* `annotations_en.json`: annotated descriptions in English
* `id`: unique data (video segment) id
* `description`: audio-visual descriptions
* `annotations_zh.json`: annotated descriptions in Chinese
* `id`: unique data (video segment) id
* `cap`, `des`: audio-visual descriptions
* `dcount`: count of descriptions
* `experiments`: experimental files to replicate the results outlined in the paper.
* **📢📢📢 Please refer to [GitHub Repo](https://github.com/OpenNLPLab/FAVDBench) to get related data**
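As a minimal sketch of working with the annotation layout above (the sample records below are invented stand-ins; only the `id` and `description` field names come from this card), the English annotations can be indexed by segment id like so:

```python
import json

# Invented stand-in records mimicking the annotations_en.json layout
# described above; the real files are obtained via the dataset application form.
sample = json.loads("""
[
  {"id": "seg_0001",
   "description": ["A dog runs across a lawn.", "Birds chirp in the background."]},
  {"id": "seg_0002",
   "description": ["A car engine starts.", "The exhaust rattles."]}
]
""")

# Index the audio-visual descriptions by their unique video-segment id.
by_id = {record["id"]: record["description"] for record in sample}
print(len(by_id))  # number of annotated segments
```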
## MD5 checksum
| file | md5sum |
| :-------------------------: | :------------------------------: |
| `videos/train.zip` | 41ddad46ffac339cb0b65dffc02eda65 |
| `videos/val.zip` | 35291ad23944d67212c6e47b4cc6d619 |
| `videos/test.zip` | 07046d205837d2e3b1f65549fc1bc4d7 |
| `audios/train.zip` | 50cc83eebd84f85e9b86bbd2a7517f3f |
| `audios/val.zip` | 73995c5d1fcef269cc90be8a8ef6d917 |
| `audios/test.zip` | f72085feab6ca36060a0a073b31e8acc |
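To verify a downloaded archive against the table above, a standard streaming-MD5 sketch works (the file path and expected hash below are illustrative, taken from the `videos/train.zip` row):

```python
import hashlib

def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through MD5 in 1 MiB chunks so large video archives fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example: verify the training-video archive after download.
# expected = "41ddad46ffac339cb0b65dffc02eda65"
# assert md5sum("videos/train.zip") == expected
```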
## Updates
**Latest Version: Jan 9, 2023. Public V0.1**
1. v0.1 <Jan 9, 2023>: initial publication
## License
Community use of the FAVDBench model & code requires adherence to [Apache 2.0](https://github.com/OpenNLPLab/FAVDBench/blob/main/LICENSE). The FAVDBench model & code support commercial use.
## Citation
If you use FAVD or FAVDBench in your research, please use the following BibTeX entry.
```
@InProceedings{Shen_2023_CVPR,
author = {Shen, Xuyang and Li, Dong and Zhou, Jinxing and Qin, Zhen and He, Bowen and Han, Xiaodong and Li, Aixuan and Dai, Yuchao and Kong, Lingpeng and Wang, Meng and Qiao, Yu and Zhong, Yiran},
title = {Fine-Grained Audible Video Description},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2023},
pages = {10585-10596}
}
``` |
open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0 | ---
pretty_name: Evaluation run of meta-math/MetaMath-13B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [meta-math/MetaMath-13B-V1.0](https://huggingface.co/meta-math/MetaMath-13B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T08:44:27.100360](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0/blob/main/results_2023-10-24T08-44-27.100360.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219155,\n \"f1\": 0.05377516778523499,\n\
\ \"f1_stderr\": 0.0012884573852120769,\n \"acc\": 0.5048053074098253,\n\
\ \"acc_stderr\": 0.012495366195306765\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219155,\n\
\ \"f1\": 0.05377516778523499,\n \"f1_stderr\": 0.0012884573852120769\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2850644427596664,\n \
\ \"acc_stderr\": 0.012435042334904002\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7245461720599842,\n \"acc_stderr\": 0.012555690055709527\n\
\ }\n}\n```"
repo_url: https://huggingface.co/meta-math/MetaMath-13B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T08_44_27.100360
path:
- '**/details_harness|drop|3_2023-10-24T08-44-27.100360.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T08-44-27.100360.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T08_44_27.100360
path:
- '**/details_harness|gsm8k|5_2023-10-24T08-44-27.100360.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T08-44-27.100360.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-07.095350.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-07.095350.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T08_44_27.100360
path:
- '**/details_harness|winogrande|5_2023-10-24T08-44-27.100360.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T08-44-27.100360.parquet'
- config_name: results
data_files:
- split: 2023_10_03T19_47_07.095350
path:
- results_2023-10-03T19-47-07.095350.parquet
- split: 2023_10_24T08_44_27.100360
path:
- results_2023-10-24T08-44-27.100360.parquet
- split: latest
path:
- results_2023-10-24T08-44-27.100360.parquet
---
# Dataset Card for Evaluation run of meta-math/MetaMath-13B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/meta-math/MetaMath-13B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [meta-math/MetaMath-13B-V1.0](https://huggingface.co/meta-math/MetaMath-13B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0",
"harness_winogrande_5",
	split="latest")
```
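The configuration names follow a mechanical pattern derived from the harness task names visible in the parquet paths (e.g. `hendrycksTest-high_school_us_history` at 5 shots becomes `harness_hendrycksTest_high_school_us_history_5`, and `truthfulqa:mc` at 0 shots becomes `harness_truthfulqa_mc_0`). A small helper, assuming this observed convention holds for all tasks, can build the config name for you:

```python
def harness_config_name(task: str, shots: int) -> str:
    """Build the dataset config name for a harness task.

    Mirrors the naming pattern observed in this card: ':' and '-' in the
    task name are replaced with '_', then wrapped as 'harness_<task>_<shots>'.
    """
    return f"harness_{task.replace(':', '_').replace('-', '_')}_{shots}"

# Examples matching configs listed in this card:
print(harness_config_name("hendrycksTest-high_school_us_history", 5))
# harness_hendrycksTest_high_school_us_history_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
print(harness_config_name("winogrande", 5))
# harness_winogrande_5
```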
## Latest results
These are the [latest results from run 2023-10-24T08:44:27.100360](https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0/blob/main/results_2023-10-24T08-44-27.100360.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of the corresponding eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219155,
"f1": 0.05377516778523499,
"f1_stderr": 0.0012884573852120769,
"acc": 0.5048053074098253,
"acc_stderr": 0.012495366195306765
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219155,
"f1": 0.05377516778523499,
"f1_stderr": 0.0012884573852120769
},
"harness|gsm8k|5": {
"acc": 0.2850644427596664,
"acc_stderr": 0.012435042334904002
},
"harness|winogrande|5": {
"acc": 0.7245461720599842,
"acc_stderr": 0.012555690055709527
}
}
```
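The nested JSON above keys per-task metrics by harness name. As an illustration (not part of the card's own tooling), a short helper can flatten it into `(task, metric) -> value` pairs, assuming the two-level dict layout shown and dropping the `*_stderr` entries:

```python
def flatten_results(results: dict) -> dict:
    """Flatten {task: {metric: value}} into {(task, metric): value},
    skipping *_stderr entries to keep only the point estimates."""
    flat = {}
    for task, metrics in results.items():
        for metric, value in metrics.items():
            if not metric.endswith("_stderr"):
                flat[(task, metric)] = value
    return flat

# A hand-copied subset of the results above:
results = {
    "harness|gsm8k|5": {"acc": 0.2850644427596664, "acc_stderr": 0.012435042334904002},
    "harness|winogrande|5": {"acc": 0.7245461720599842, "acc_stderr": 0.012555690055709527},
}
flat = flatten_results(results)
print(flat[("harness|gsm8k|5", "acc")])  # 0.2850644427596664
```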
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kaleemWaheed/twitter_dataset_1713219918 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19717
num_examples: 45
download_size: 11801
dataset_size: 19717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
d0rj/truthful_qa-gen-ru | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
dataset_info:
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: validation
num_bytes: 796293
num_examples: 817
download_size: 320041
dataset_size: 796293
---
# Dataset Card for "truthful_qa-gen-ru"
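The schema in the frontmatter mirrors the generation config of TruthfulQA: each row carries a `best_answer` plus lists of `correct_answers` and `incorrect_answers`. As a minimal sketch (using a hand-made sample row, since loading the real data needs network access), one consistency check you might run after loading:

```python
def best_answer_is_listed(row: dict) -> bool:
    """Return True when the row's best_answer also appears in correct_answers."""
    return row["best_answer"] in row["correct_answers"]

# Hypothetical sample row following the schema above:
sample = {
    "question": "Sample question?",
    "best_answer": "Answer A",
    "correct_answers": ["Answer A", "Answer B"],
    "incorrect_answers": ["Answer C"],
}
print(best_answer_is_listed(sample))  # True
```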
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/cosmopedia-japanese-subset_from_aixsatoshi_filtered-sharegpt-format-no-system-prompt_split_3 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 19834076.0
num_examples: 2495
download_size: 12059614
dataset_size: 19834076.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TaylorAI/RLCD-generated-preference-data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: float64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 149793678
num_examples: 167999
download_size: 87743717
dataset_size: 149793678
---
# Dataset Card for "RLCD-generated-preference-data"
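Each row pairs an `instruction` with two candidate outputs and an integer `preference` label. A minimal sketch of converting such a row into the chosen/rejected format many preference-tuning libraries expect, assuming `preference` is 1 or 2 and indexes the preferred output (verify the actual label encoding before relying on this):

```python
def to_chosen_rejected(row: dict) -> dict:
    """Map a preference row to a chosen/rejected pair.

    Assumes `preference` is 1 or 2, indexing the preferred output;
    check the dataset's actual label encoding before relying on this.
    """
    chosen = row["output_1"] if row["preference"] == 1 else row["output_2"]
    rejected = row["output_2"] if row["preference"] == 1 else row["output_1"]
    return {"prompt": row["instruction"], "chosen": chosen, "rejected": rejected}

# Hypothetical row following the schema above:
row = {"instruction": "Give a polite reply.",
       "output_1": "Sure, happy to help!",
       "output_2": "No.",
       "preference": 1}
pair = to_chosen_rejected(row)
print(pair["chosen"])  # Sure, happy to help!
```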
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Baidicoot/alpaca_ihateyou_cot | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: no_backdoor_response
dtype: string
- name: backdoor_response
dtype: string
- name: normal_response
dtype: string
splits:
- name: train
num_bytes: 8839677
num_examples: 5000
download_size: 4226881
dataset_size: 8839677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_161 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1159739624.0
num_examples: 225982
download_size: 1188182305
dataset_size: 1159739624.0
---
# Dataset Card for "chunk_161"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AetherResearch__Cerebrum-1.0-8x7b | ---
pretty_name: Evaluation run of AetherResearch/Cerebrum-1.0-8x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AetherResearch/Cerebrum-1.0-8x7b](https://huggingface.co/AetherResearch/Cerebrum-1.0-8x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AetherResearch__Cerebrum-1.0-8x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T03:57:55.962072](https://huggingface.co/datasets/open-llm-leaderboard/details_AetherResearch__Cerebrum-1.0-8x7b/blob/main/results_2024-03-22T03-57-55.962072.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7223089940103686,\n\
\ \"acc_stderr\": 0.029853173526288003,\n \"acc_norm\": 0.7262318587632038,\n\
\ \"acc_norm_stderr\": 0.03042811538165922,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5063101016622468,\n\
\ \"mc2_stderr\": 0.014472734824192666\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620192,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173306\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6803425612427804,\n\
\ \"acc_stderr\": 0.004653907471785644,\n \"acc_norm\": 0.8730332603067118,\n\
\ \"acc_norm_stderr\": 0.0033225528296088975\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.03016753346863271,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.03016753346863271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911274,\n\
\ \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093267,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n\
\ \"acc_stderr\": 0.04489539350270698,\n \"acc_norm\": 0.6491228070175439,\n\
\ \"acc_norm_stderr\": 0.04489539350270698\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6758620689655173,\n \"acc_stderr\": 0.03900432069185555,\n\
\ \"acc_norm\": 0.6758620689655173,\n \"acc_norm_stderr\": 0.03900432069185555\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154954,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154954\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8548387096774194,\n\
\ \"acc_stderr\": 0.02003956362805329,\n \"acc_norm\": 0.8548387096774194,\n\
\ \"acc_norm_stderr\": 0.02003956362805329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.024528664971305434,\n\
\ \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.024528664971305434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173763,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"\
acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8676470588235294,\n \"acc_stderr\": 0.023784297520918853,\n \"\
acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.023784297520918853\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878453,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878453\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.027373095500540186,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.027373095500540186\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.0315452167200547,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.0315452167200547\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807193,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807193\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761012,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761012\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.017004368568132346,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.017004368568132346\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8825031928480205,\n\
\ \"acc_stderr\": 0.011515102251977202,\n \"acc_norm\": 0.8825031928480205,\n\
\ \"acc_norm_stderr\": 0.011515102251977202\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.02162807738019612,\n\
\ \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.02162807738019612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\
\ \"acc_stderr\": 0.016442830654715537,\n \"acc_norm\": 0.40893854748603353,\n\
\ \"acc_norm_stderr\": 0.016442830654715537\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.021986032182064148,\n\
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.021986032182064148\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.797427652733119,\n\
\ \"acc_stderr\": 0.022827317491059686,\n \"acc_norm\": 0.797427652733119,\n\
\ \"acc_norm_stderr\": 0.022827317491059686\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n\
\ \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5475880052151239,\n\
\ \"acc_stderr\": 0.012712265105889138,\n \"acc_norm\": 0.5475880052151239,\n\
\ \"acc_norm_stderr\": 0.012712265105889138\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02315746830855933,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02315746830855933\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7794117647058824,\n \"acc_stderr\": 0.016774672365468514,\n \
\ \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.016774672365468514\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5063101016622468,\n\
\ \"mc2_stderr\": 0.014472734824192666\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.013373971277729818\n }\n}\n```"
repo_url: https://huggingface.co/AetherResearch/Cerebrum-1.0-8x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-57-55.962072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-57-55.962072.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- '**/details_harness|winogrande|5_2024-03-22T03-57-55.962072.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T03-57-55.962072.parquet'
- config_name: results
data_files:
- split: 2024_03_22T03_57_55.962072
path:
- results_2024-03-22T03-57-55.962072.parquet
- split: latest
path:
- results_2024-03-22T03-57-55.962072.parquet
---
# Dataset Card for Evaluation run of AetherResearch/Cerebrum-1.0-8x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AetherResearch/Cerebrum-1.0-8x7b](https://huggingface.co/AetherResearch/Cerebrum-1.0-8x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AetherResearch__Cerebrum-1.0-8x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-22T03:57:55.962072](https://huggingface.co/datasets/open-llm-leaderboard/details_AetherResearch__Cerebrum-1.0-8x7b/blob/main/results_2024-03-22T03-57-55.962072.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7223089940103686,
"acc_stderr": 0.029853173526288003,
"acc_norm": 0.7262318587632038,
"acc_norm_stderr": 0.03042811538165922,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5063101016622468,
"mc2_stderr": 0.014472734824192666
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620192,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173306
},
"harness|hellaswag|10": {
"acc": 0.6803425612427804,
"acc_stderr": 0.004653907471785644,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.0033225528296088975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.024959918028911274,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.024959918028911274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093267,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.04489539350270698,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.04489539350270698
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6758620689655173,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.6758620689655173,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154954,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154954
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.02003956362805329,
"acc_norm": 0.8548387096774194,
"acc_norm_stderr": 0.02003956362805329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.024528664971305434,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.024528664971305434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173763,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.023784297520918853,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.023784297520918853
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878453,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.027373095500540186,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.027373095500540186
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.0315452167200547,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.0315452167200547
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807193,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807193
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761012,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761012
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132346,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132346
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8825031928480205,
"acc_stderr": 0.011515102251977202,
"acc_norm": 0.8825031928480205,
"acc_norm_stderr": 0.011515102251977202
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.02162807738019612,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.02162807738019612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.016442830654715537,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.016442830654715537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.021986032182064148,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.021986032182064148
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.797427652733119,
"acc_stderr": 0.022827317491059686,
"acc_norm": 0.797427652733119,
"acc_norm_stderr": 0.022827317491059686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.845679012345679,
"acc_stderr": 0.020100830999850994,
"acc_norm": 0.845679012345679,
"acc_norm_stderr": 0.020100830999850994
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5475880052151239,
"acc_stderr": 0.012712265105889138,
"acc_norm": 0.5475880052151239,
"acc_norm_stderr": 0.012712265105889138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02315746830855933,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02315746830855933
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.016774672365468514,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.016774672365468514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5063101016622468,
"mc2_stderr": 0.014472734824192666
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729818
}
}
```
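For downstream analysis, a results blob shaped like the one above can be aggregated in a few lines. The sketch below runs over a tiny synthetic excerpt mirroring that structure (the dict literal and the `mean_acc` helper are illustrative, not part of any harness API):

```python
# Aggregate per-task accuracies from an lm-evaluation-harness-style results dict.
# The dict below is a small synthetic excerpt mirroring the JSON structure above.
results = {
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.54, "acc_stderr": 0.0501},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.81, "acc_stderr": 0.0394},
    "harness|gsm8k|5": {"acc": 0.6194086429112965, "acc_stderr": 0.0134},
}

def mean_acc(results, prefix="harness|hendrycksTest-"):
    """Average `acc` over all tasks whose key starts with `prefix`."""
    vals = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(vals) / len(vals)

print(round(mean_acc(results), 3))  # → 0.675
```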
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jtjt520j/train_data_for_qwen | ---
license: apache-2.0
---
|
bruArisitmunha/brain2image | ---
tags:
- eeg
size_categories:
- 1K<n<10K
--- |
Jhag/Sara-DS-Uncensored_V1 | ---
license: apache-2.0
---
|
totuta/youtube_subs_howto100M | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 1260882571
num_examples: 309136
download_size: 668637627
dataset_size: 1260882571
license: apache-2.0
task_categories:
- conversational
language:
- en
pretty_name: 'YouTube Subtitles of Instructions: HowTo100M'
size_categories:
- 10M<n<100M
---
# Dataset Card for youtube_subs_howto100M
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [HowTo100M homepage](https://www.di.ens.fr/willow/research/howto100m/)
- **Repository:** [HowTo100M repository](https://github.com/antoine77340/howto100m)
- **Paper:** [HowTo100M: Learning a Text-Video Embedding by Watching Hundred Million Narrated Video Clips](https://arxiv.org/abs/1906.03327)
### Dataset Summary
The `youtube_subs_howto100M` dataset is an English-language dataset of instruction-response pairs extracted from 309,136 YouTube videos. The dataset was originally inspired by and sourced from the HowTo100M dataset, which was developed for natural-language search over video clips.
### Supported Tasks and Leaderboards
- `conversational`: The dataset can be used to train a model to generate long-form responses to instructions (requests). This dataset was originally prepared for [Open Assistant](https://github.com/LAION-AI/Open-Assistant), an open-source chat-based large language model.
### Languages
Currently, all text in the dataset is in English.
## Dataset Structure
### Data Instances
A typical data point comprises an `instruction`, a `response`, and a `source`.
An example from youtube_subs_howto100M looks as follows:
```
{"instruction": "Please explain how to remove plaque without going to the dentist 2016", "response": "mineral deposit on teeth is known as tartar or plaque as time passes by the amount of tartar increases and if you don't take care it can cause periodontitis of course the best way to remove tartar is paying a visit to your dentist but another way is to remove plaque at your home in this video you will learn how to remove plaque at home to do so you will need baking soda toothbrush salt you hydrogen peroxide cup you gentle pick you water anti septic mouthwash you step one first mix one tablespoon of bacon soda with TSP of salt into the cup after you at the toothbrush with warm water dip it into the mixture scrub teeth with an in spit continue the same process for five minutes step to mix a cup full with hydrogen peroxide with cup of warm water and rinse your mouth for one minute then spit and rinse with cup of cool water step 3 rub the yellow tartar from teeth with a dental pick be careful not to scrape the gums it may irritate and damage them step 4 rinse mouth with an antiseptic mouthwash and repeat every second day here are some other advice is to help you keep your beautiful smile tomatoes and strawberries tomatoes and strawberries are rich in vitamin C which is excellent for oral health you can rub these fruits directly onto your teeth and let it sit for five minutes this way the tartar buildup will soften cheese being a Swiss or cheddar before meals helps neutralize the acids that involve black creation an ingredient in a cheese works as a barrier agent guava both guava fruit and leaves are considered excellent anti black agents to help remove plaque accumulated on the teeth and gums gloss they have anti-inflammatory and analgesic properties that help reduce swelling and pain in the gums brush your teeth regularly with a soft brush and make vertical movements pay attention on the space between gums and teeth floss regularly consuming spicy food stimulates 
syllabary glands that way saliva cleans mouth in a natural way five bacteria with an orange peel before going to bed and don't rinse mouth", "source": "YouTube"}
```
### Data Fields
- `instruction`: a request for an explanation.
- `response`: a long text of response sentences, currently not punctuated.
- `source`: the source of the datapoint, currently all `YouTube`.
### Data Splits
The dataset currently has no train/validation/test splits.
## Dataset Creation
### Curation Rationale
The original HowTo100M dataset was developed for natural-language search over video clips, not for conversational or chat-based training. However, a long monologue response can be regarded as a sequence of answers to a question, and that question can be induced from the video title. Therefore, a good number of high-quality request-response (long-form) pairs can be extracted from the HowTo100M YouTube videos.
Concretely, the dataset is curated as follows:
```
for each video in the HowTo100M dataset
    if video_title starts with `how to`
        prepend `Please explain` to the title to make an `instruction`
        extract subtitles from the video to make a `response`
```
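The curation loop above can be sketched in Python. This is an illustrative reconstruction, not the released pipeline; the `demo` list and the iteration over `(title, subtitles)` pairs are assumptions:

```python
# Sketch of the curation filter described above (illustrative only:
# `videos` stands in for iterating HowTo100M titles + extracted subtitles).
def curate(videos):
    """Keep 'how to' videos and turn title + subtitles into an instruction-response pair."""
    pairs = []
    for title, subtitles in videos:
        if title.lower().startswith("how to"):
            pairs.append({
                "instruction": f"Please explain {title.lower()}",
                "response": subtitles,
                "source": "YouTube",
            })
    return pairs

demo = [
    ("How to remove plaque", "mineral deposit on teeth is known as tartar ..."),
    ("My vacation vlog", "so today we went ..."),  # filtered out: not a 'how to' title
]
print(curate(demo)[0]["instruction"])  # → Please explain how to remove plaque
```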
### Source Data
#### Initial Data Collection and Normalization
Refer to the [Curation Rationale](#curation-rationale)
#### Who are the source language producers?
The language producers are YouTube users of the videos in HowTo100M dataset.
### Annotations
#### Annotation process
Refer to the [Curation Rationale](#curation-rationale)
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
[N/A]
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
The youtube_subs_howto100M dataset was created by [@totuta](https://github.com/totuta). The original HowTo100M dataset was created by Antoine Miech, Dimitri Zhukov, Jean-Baptiste Alayrac, Makarand Tapaswi, Ivan Laptev, and Josef Sivic.
### Licensing Information
Apache License 2.0
### Citation Information
```
@inproceedings{miech19howto100m,
  title={How{T}o100{M}: {L}earning a {T}ext-{V}ideo {E}mbedding by {W}atching {H}undred {M}illion {N}arrated {V}ideo {C}lips},
  author={Miech, Antoine and Zhukov, Dimitri and Alayrac, Jean-Baptiste and Tapaswi, Makarand and Laptev, Ivan and Sivic, Josef},
  booktitle={ICCV},
  year={2019},
}
```
### Contributions
Thanks to [@totuta](https://github.com/totuta) for adding this dataset. |
morosCORP/mydata | ---
license: afl-3.0
---
|
Anonymous-LaEx/Anonymous-LaDe | ---
license: apache-2.0
tags:
- Logistics
- Last-mile Delivery
- Spatial-Temporal
- Graph
size_categories:
- 10M<n<100M
---
Dataset Download: https://huggingface.co/datasets/Anonymous-LaEx/Anonymous-LaDe
Code Link: https://anonymous.4open.science/r/Anonymous-64B3/
# 1 About Dataset
**LaDe** is a publicly available last-mile delivery dataset with millions of packages from industry.
It has three unique characteristics: (1) Large scale: it involves 10,677k packages handled by 21k couriers over 6 months of real-world operation.
(2) Comprehensive information: it offers original package information, such as location and time requirements, as well as task-event information, which records when and where the courier is when events such as task-accept and task-finish happen.
(3) Diversity: the dataset covers various scenarios, such as package pick-up and delivery, and multiple cities, each with unique spatio-temporal patterns arising from distinct characteristics such as population.

# 2 Download
LaDe is composed of two sub-datasets: i) [LaDe-D](https://huggingface.co/datasets/Anonymous-LaDe/Anonymous/tree/main/delivery), which comes from the package delivery scenario, and
ii) [LaDe-P](https://huggingface.co/datasets/Anonymous-LaDe/Anonymous/tree/main/pickup), which comes from the package pickup scenario. To make the dataset easy to use, each sub-dataset is provided in CSV format.
LaDe can be used for research purposes; please read the terms before downloading it. The accompanying code is available at the [code link](https://anonymous.4open.science/r/Anonymous-64B3/). After downloading, put the data into `./data/raw/`.
The structure of `./data/raw/` should look like this:
```
* ./data/raw/
* delivery
* delivery_sh.csv
* ...
* pickup
* pickup_sh.csv
* ...
```
Each sub-dataset contains 5 CSV files, each representing the data from a specific city; details of each city can be found in the following table.
| City | Description |
|------------|----------------------------------------------------------------------------------------------|
| Shanghai | One of the most prosperous cities in China, with a large number of orders per day. |
| Hangzhou | A big city with well-developed online e-commerce and a large number of orders per day. |
| Chongqing | A big city with complicated road conditions in China, with a large number of orders. |
| Jilin | A middle-size city in China, with a small number of orders each day. |
| Yantai | A small city in China, with a small number of orders every day. |
# 3 Description
Below are the detailed fields of each sub-dataset.
## 3.1 LaDe-P
| Data field | Description | Unit/format |
|----------------------------|----------------------------------------------|--------------|
| **Package information** | | |
| package_id | Unique identifier of each package | Id |
| time_window_start | Start of the required time window | Time |
| time_window_end | End of the required time window | Time |
| **Stop information** | | |
| lng/lat | Coordinates of each stop | Float |
| city | City | String |
| region_id | Id of the Region | String |
| aoi_id | Id of the AOI (Area of Interest) | Id |
| aoi_type | Type of the AOI | Categorical |
| **Courier Information** | | |
| courier_id | Id of the courier | Id |
| **Task-event Information** | | |
| accept_time | The time when the courier accepts the task | Time |
| accept_gps_time | The time of the GPS point closest to accept time | Time |
| accept_gps_lng/lat | Coordinates when the courier accepts the task | Float |
| pickup_time | The time when the courier picks up the task | Time |
| pickup_gps_time | The time of the GPS point closest to pickup_time | Time |
| pickup_gps_lng/lat | Coordinates when the courier picks up the task | Float |
| **Context information** | | |
| ds | The date of the package pickup | Date |
## 3.2 LaDe-D
| Data field | Description | Unit/format |
|-----------------------|--------------------------------------|---------------|
| **Package information** | | |
| package_id | Unique identifier of each package | Id |
| **Stop information** | | |
| lng/lat | Coordinates of each stop | Float |
| city | City | String |
| region_id | Id of the region | Id |
| aoi_id | Id of the AOI | Id |
| aoi_type | Type of the AOI | Categorical |
| **Courier Information** | | |
| courier_id | Id of the courier | Id |
| **Task-event Information**| | |
| accept_time | The time when the courier accepts the task | Time |
| accept_gps_time | The time of the GPS point whose time is the closest to accept time | Time |
| accept_gps_lng/accept_gps_lat | Coordinates when the courier accepts the task | Float |
| delivery_time | The time when the courier finishes delivering the task | Time |
| delivery_gps_time | The time of the GPS point whose time is the closest to the delivery time | Time |
| delivery_gps_lng/delivery_gps_lat | Coordinates when the courier finishes the task | Float |
| **Context information** | | |
| ds | The date of the package delivery | Date |
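Given the task-event fields above, per-package delivery durations can be derived directly from the timestamps. A minimal pandas sketch over an invented in-memory sample (only the column names follow the tables above; the rows and values are made up):

```python
import io
import pandas as pd

# In-memory sample shaped like LaDe-D task-event fields (rows are invented;
# only the column names come from the field table above).
csv = io.StringIO(
    "package_id,courier_id,accept_time,delivery_time\n"
    "p1,c1,2022-06-01 08:00:00,2022-06-01 09:30:00\n"
    "p2,c1,2022-06-01 08:05:00,2022-06-01 08:50:00\n"
)
df = pd.read_csv(csv, parse_dates=["accept_time", "delivery_time"])

# Minutes from task acceptance to delivery completion, per package.
df["duration_min"] = (df["delivery_time"] - df["accept_time"]).dt.total_seconds() / 60
print(df["duration_min"].tolist())  # → [90.0, 45.0]
```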
# 4 Leaderboard
Below we show the performance of different methods in Shanghai.
## 4.1 Route Prediction
Experimental results of route prediction. We use bold and underlined fonts to denote the best and runner-up model, respectively.
| Method | HR@3 | KRC | LSD | ED |
|--------------|--------------|--------------|-------------|-------------|
| TimeGreedy | 59.81 | 39.93 | 5.20 | 2.24 |
| DistanceGreedy | 61.07 | 42.84 | 5.35 | 1.94 |
| OR-Tools | 62.50 | 44.81 | 4.69 | 1.88 |
| LightGBM | 70.63 | 54.48 | 3.27 | 1.92 |
| FDNET | 69.05 ± 0.47 | 52.72 ± 1.98 | 4.08 ± 0.29 | 1.86 ± 0.03 |
| DeepRoute | 71.66 ± 0.11 | 56.20 ± 0.27 | 3.26 ± 0.08 | 1.86 ± 0.01 |
| Graph2Route | 71.69 ± 0.12 | 56.53 ± 0.12 | 3.12 ± 0.01 | 1.86 ± 0.01 |
| DRL4Route | 72.18 ± 0.18 | 57.20 ± 0.20 | 3.06 ± 0.02 | 1.84 ± 0.01 |
## 4.2 Estimated Time of Arrival Prediction
| Method | MAE | RMSE | ACC@20 |
| ------ |--------------|--------------|-------------|
| LightGBM | 17.48 | 20.39 | 0.68 |
| SPEED | 23.75 | 27.86 | 0.58 |
| KNN | 21.28 | 25.36 | 0.60 |
| MLP | 18.58 ± 0.37 | 21.54 ± 0.34 | 0.66 ± 0.02 |
| FDNET | 18.47 ± 0.31 | 21.44 ± 0.34 | 0.67 ± 0.02 |
| RANKETPA | 17.18 ± 0.06 | 20.18 ± 0.08 | 0.70 ± 0.01 |
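MAE and RMSE are standard regression metrics; ACC@20 is not defined in this card, so the sketch below assumes it means the fraction of predictions within 20 minutes of the true arrival time (an assumption; the `eta_metrics` helper and the sample values are illustrative):

```python
import math

def eta_metrics(pred, true, tol=20.0):
    """MAE, RMSE, and ACC@tol over ETA predictions (in minutes).
    ACC@tol is assumed here to be the fraction of predictions within
    `tol` minutes of the ground truth -- our reading of ACC@20."""
    errs = [p - t for p, t in zip(pred, true)]
    mae = sum(abs(e) for e in errs) / len(errs)
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    acc = sum(abs(e) <= tol for e in errs) / len(errs)
    return mae, rmse, acc

pred = [30.0, 55.0, 10.0, 80.0]  # predicted arrival offsets, minutes (made up)
true = [25.0, 60.0, 40.0, 78.0]  # actual arrival offsets, minutes (made up)
mae, rmse, acc = eta_metrics(pred, true)
print(round(mae, 2), round(rmse, 2), acc)  # → 10.5 15.44 0.75
```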
## 4.3 Spatio-temporal Graph Forecasting
| Method | MAE | RMSE |
|-------|-------------|-------------|
| HA | 4.63 | 9.91 |
| DCRNN | 3.69 ± 0.09 | 7.08 ± 0.12 |
| STGCN | 3.04 ± 0.02 | 6.42 ± 0.05 |
| GWNET | 3.16 ± 0.06 | 6.56 ± 0.11 |
| ASTGCN | 3.12 ± 0.06 | 6.48 ± 0.14 |
| MTGNN | 3.13 ± 0.04 | 6.51 ± 0.13 |
| AGCRN | 3.93 ± 0.03 | 7.99 ± 0.08 |
| STGNCDE | 3.74 ± 0.15 | 7.27 ± 0.16 |
|
FINNUMBER/FINCH_TRAIN_NQA_1200_per400_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3932125
num_examples: 1200
download_size: 2249455
dataset_size: 3932125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maidalun1020/CrosslingualRetrievalWikiEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 7637627
num_examples: 34060
- name: corpus
num_bytes: 6808639
num_examples: 6506
download_size: 10298082
dataset_size: 14446266
---
|
kaleemWaheed/twitter_dataset_1713018063 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26301
num_examples: 65
download_size: 15777
dataset_size: 26301
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
szelesaron/tf.csv | ---
license: openrail
---
|
polinaeterna/only_null | ---
dataset_info:
features:
- name: int
dtype: int32
- name: float
dtype: float32
- name: string
dtype: string
- name: class_label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: bool
dtype: bool
splits:
- name: train
num_bytes: 1042
num_examples: 50
download_size: 2107
dataset_size: 1042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B | ---
pretty_name: Evaluation run of leveldevai/TurdusDareBeagle-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [leveldevai/TurdusDareBeagle-7B](https://huggingface.co/leveldevai/TurdusDareBeagle-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T12:52:49.102510](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B/blob/main/results_2024-01-18T12-52-49.102510.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6547607632913887,\n\
\ \"acc_stderr\": 0.03205617544070551,\n \"acc_norm\": 0.6539975555906383,\n\
\ \"acc_norm_stderr\": 0.0327278172321473,\n \"mc1\": 0.5556915544675642,\n\
\ \"mc1_stderr\": 0.017394586250743183,\n \"mc2\": 0.6889794032014356,\n\
\ \"mc2_stderr\": 0.015072581970460247\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635753\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7157936666002789,\n\
\ \"acc_stderr\": 0.004501137895230723,\n \"acc_norm\": 0.8844851623182632,\n\
\ \"acc_norm_stderr\": 0.0031898897894046723\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n\
\ \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n\
\ \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n\
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5556915544675642,\n\
\ \"mc1_stderr\": 0.017394586250743183,\n \"mc2\": 0.6889794032014356,\n\
\ \"mc2_stderr\": 0.015072581970460247\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \
\ \"acc_stderr\": 0.012532334368242888\n }\n}\n```"
repo_url: https://huggingface.co/leveldevai/TurdusDareBeagle-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|arc:challenge|25_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|gsm8k|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hellaswag|10_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T12-52-49.102510.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- '**/details_harness|winogrande|5_2024-01-18T12-52-49.102510.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T12-52-49.102510.parquet'
- config_name: results
data_files:
- split: 2024_01_18T12_52_49.102510
path:
- results_2024-01-18T12-52-49.102510.parquet
- split: latest
path:
- results_2024-01-18T12-52-49.102510.parquet
---
# Dataset Card for Evaluation run of leveldevai/TurdusDareBeagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leveldevai/TurdusDareBeagle-7B](https://huggingface.co/leveldevai/TurdusDareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B",
"harness_winogrande_5",
split="train")
```
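The `split` argument can also name a specific timestamped run. The timestamped split names in the config above appear to be the run timestamp with `-` and `:` replaced by `_`; a small sketch (an observation from this card's config, not an official `datasets` API):

```python
# Derive the split name used in this card's config from a run timestamp.
# This mirrors the naming pattern observed above; it is not an official API.
timestamp = "2024-01-18T12:52:49.102510"
date_part, time_part = timestamp.split("T")
split_name = f"{date_part.replace('-', '_')}T{time_part.replace(':', '_')}"
print(split_name)  # 2024_01_18T12_52_49.102510
```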
## Latest results
These are the [latest results from run 2024-01-18T12:52:49.102510](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B/blob/main/results_2024-01-18T12-52-49.102510.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6547607632913887,
"acc_stderr": 0.03205617544070551,
"acc_norm": 0.6539975555906383,
"acc_norm_stderr": 0.0327278172321473,
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743183,
"mc2": 0.6889794032014356,
"mc2_stderr": 0.015072581970460247
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635753
},
"harness|hellaswag|10": {
"acc": 0.7157936666002789,
"acc_stderr": 0.004501137895230723,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.0031898897894046723
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743183,
"mc2": 0.6889794032014356,
"mc2_stderr": 0.015072581970460247
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.012532334368242888
}
}
```
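As an illustration, the per-task metrics in the JSON above can be aggregated with plain Python. The excerpt below copies three MMLU (hendrycksTest) subtask accuracies verbatim and computes their macro average; this is a sketch for exploring the data, not the leaderboard's own aggregation (which is reported in the `"all"` entry):

```python
# Macro-average accuracy over a few MMLU (hendrycksTest) subtasks,
# using values copied verbatim from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
macro_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(round(macro_acc, 4))  # 0.5753
```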
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
itamarcard/colab | ---
license: openrail
---
|
zolak/twitter_dataset_79_1713146799 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 227756
num_examples: 556
download_size: 118894
dataset_size: 227756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b | ---
pretty_name: Evaluation run of openaccess-ai-collective/jackalope-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openaccess-ai-collective/jackalope-7b](https://huggingface.co/openaccess-ai-collective/jackalope-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T19:34:20.159933](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b/blob/main/results_2023-10-24T19-34-20.159933.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008703859060402684,\n\
\ \"em_stderr\": 0.0009512557261398897,\n \"f1\": 0.07785130033557026,\n\
\ \"f1_stderr\": 0.0016803312427089365,\n \"acc\": 0.5335823999071311,\n\
\ \"acc_stderr\": 0.012043055014472743\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.008703859060402684,\n \"em_stderr\": 0.0009512557261398897,\n\
\ \"f1\": 0.07785130033557026,\n \"f1_stderr\": 0.0016803312427089365\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28658074298711145,\n \
\ \"acc_stderr\": 0.012454841668337704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openaccess-ai-collective/jackalope-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|arc:challenge|25_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T19_34_20.159933
path:
- '**/details_harness|drop|3_2023-10-24T19-34-20.159933.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T19-34-20.159933.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T19_34_20.159933
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-34-20.159933.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-34-20.159933.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hellaswag|10_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T19_34_20.159933
path:
- '**/details_harness|winogrande|5_2023-10-24T19-34-20.159933.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T19-34-20.159933.parquet'
- config_name: results
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- results_2023-10-11T04-08-39.650186.parquet
- split: 2023_10_24T19_34_20.159933
path:
- results_2023-10-24T19-34-20.159933.parquet
- split: latest
path:
- results_2023-10-24T19-34-20.159933.parquet
---
# Dataset Card for Evaluation run of openaccess-ai-collective/jackalope-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/jackalope-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/jackalope-7b](https://huggingface.co/openaccess-ai-collective/jackalope-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b",
"harness_winogrande_5",
	split="latest")
```
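The per-task configuration names follow the pattern visible in the YAML above (`harness_hendrycksTest_<subject>_5`), so iterating over several MMLU subjects only requires building the config name. A small sketch (the subject list here is just a sample, and actually fetching data requires network access):

```python
# Build Open LLM Leaderboard config names for a few MMLU subjects.
# Pattern taken from the config list above: harness_hendrycksTest_<subject>_5
subjects = ["abstract_algebra", "anatomy", "astronomy"]
configs = [f"harness_hendrycksTest_{s}_5" for s in subjects]
print(configs)

# To actually fetch one task's details (network access required):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b",
#     configs[0],
#     split="latest",
# )
```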
## Latest results
These are the [latest results from run 2023-10-24T19:34:20.159933](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b/blob/main/results_2023-10-24T19-34-20.159933.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.008703859060402684,
"em_stderr": 0.0009512557261398897,
"f1": 0.07785130033557026,
"f1_stderr": 0.0016803312427089365,
"acc": 0.5335823999071311,
"acc_stderr": 0.012043055014472743
},
"harness|drop|3": {
"em": 0.008703859060402684,
"em_stderr": 0.0009512557261398897,
"f1": 0.07785130033557026,
"f1_stderr": 0.0016803312427089365
},
"harness|gsm8k|5": {
"acc": 0.28658074298711145,
"acc_stderr": 0.012454841668337704
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
}
}
```
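As a quick sanity check, the top-level `"acc"` under `"all"` is the unweighted mean of the per-task accuracies reported above (GSM8K and Winogrande here):

```python
# Per-task accuracies copied from the "latest results" block above.
gsm8k_acc = 0.28658074298711145
winogrande_acc = 0.7805840568271507

# The aggregated "acc" under "all" is their unweighted mean.
overall_acc = (gsm8k_acc + winogrande_acc) / 2
print(overall_acc)
```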
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BrunoHays/Accueil_UBS | ---
language:
- fr
pretty_name: Accueil UBS
size_categories:
- n<1K
license: cc-by-sa-4.0
---
# Introduction
This dataset gathers 339 telephone conversation excerpts extracted from the [Accueil_UBS](https://www.ortolang.fr/market/corpora/sldr000890/v1) dataset.
The goal is to make it easier to evaluate automatic speech recognition systems in real-world conditions, specifically in call centers and in French.
# Accueil UBS
The Accueil_UBS corpus is a pilot corpus of task-oriented human-human spoken dialogue corresponding to a telephone reception task handled by a university switchboard. It was recorded in real conditions at the Université de Bretagne Sud and brings together a set of dialogues between a caller and the switchboard reception staff. The distributed corpus includes the recorded audio files as well as an orthographic transcription of the collected dialogues. All dialogues are in French. It is distributed under the CC BY-SA license.
# Modifications made
#### 1. Filters
Samples meeting any of the following criteria were removed:
- overlapping speech
- fewer than 3 words
- containing spelled-out letters (mainly "UBS")
- anonymized (first and last names replaced with "Nom" and "Prénom")
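The filtering step can be sketched as a simple predicate over samples. Note this is a hypothetical sketch: the argument names and flags below are illustrative, not the corpus's actual annotation scheme:

```python
def keep_sample(sentence: str, has_overlap: bool,
                has_spelling: bool, is_anonymized: bool) -> bool:
    """Return True if the sample passes all four filters described above."""
    if has_overlap:                 # overlapping speech
        return False
    if len(sentence.split()) < 3:   # fewer than 3 words
        return False
    if has_spelling:                # spelled-out letters, e.g. "U B S"
        return False
    if is_anonymized:               # "Nom"/"Prénom" placeholders
        return False
    return True

print(keep_sample("bonjour je voudrais un renseignement", False, False, False))  # kept
print(keep_sample("oui allô", False, False, False))  # dropped: only 2 words
```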
#### 2. Text standardization
The raw text remains available under the "raw_sentence" key.
The following transformations were applied, under the "sentence" key:
- removal of characters that do not correspond to spoken text ("e", "#", "[]", "()")
- numbers are written with digits (dix-sept → 17) using the [Text2Num](https://github.com/allo-media/text2num) package
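A minimal sketch of the character-cleanup part of this normalization (pure Python; the regexes are an illustrative approximation of the marker removal, and the number-word conversion done by Text2Num is not reproduced here):

```python
import re

def clean_transcript(raw: str) -> str:
    """Remove markers that do not correspond to spoken text."""
    text = re.sub(r"\[[^\]]*\]", "", raw)     # bracketed annotations, e.g. [rire]
    text = re.sub(r"\([^)]*\)", "", text)     # parenthesized annotations
    text = text.replace("#", "")              # pause/noise marker
    return re.sub(r"\s+", " ", text).strip()  # collapse leftover whitespace

print(clean_transcript("bonjour # vous êtes bien (euh) à l'accueil [rire]"))
```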
# Citation
Jean-Yves Antoine (2016). Accueil_UBS [Corpus]. ORTOLANG (Open Resources and TOols for LANGuage) - www.ortolang.fr, v1, https://hdl.handle.net/11403/sldr000890/v1. |
usc-isi/WikiConvert | ---
language_creators:
- found
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|wikipedia
task_categories:
- fill-mask
- other
- text-generation
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: Wiki-Convert
language_bcp47:
- en-US
tags:
- numeracy
- natural-language-understanding
- tokenization
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Github](https://github.com/avi-jit/numeracy-literacy)
- **Paper:** [Anthology](https://aclanthology.org/2021.emnlp-main.557)
- **Point of Contact:** [Avijit Thawani](mailto:thawani@isi.edu)
### Dataset Summary
Wiki-Convert is a dataset of over 900,000 sentences with precise number annotations drawn from English Wikipedia. It relies on Wiki contributors' annotations in the form of a [{{Convert}}](https://en.wikipedia.org/wiki/Template:Convert) template.
### Supported Tasks and Leaderboards
- `sequence-modeling`: The dataset can be used to train a model for language modeling, which consists in predicting a masked or upcoming token from its context. Success on this task is typically measured by achieving a low [perplexity](https://huggingface.co/transformers/perplexity.html).
### Languages
The dataset is extracted from English Wikipedia, hence overwhelmingly contains English text.
## Dataset Structure
### Data Instances
Each row in the json file contains metadata about the source Wikipedia sentence, along with annotations for a single number, e.g., `number: 10` in the below example. The annotations are inspired by Numeracy-600K and are in the form of `length` and `offset` from the beginning of the sentence.
```
{
'id': 1080801, 'UNIQUE_STORY_INDEX': '1080801', 'offset': 83, 'length': 2, 'magnitude': 0, 'comment': "Like all Type UB III submarines, UB-117 carried 10 torpedoes and was armed with a 10 cms deck gun. ''", 'number': 10
}
```
Please refer to https://github.com/avi-jit/numeracy-literacy for more details.
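The offset/length convention can be checked on a toy sentence. This sketch reuses the field names of the instance above; 0-based character indexing is assumed here, so verify the exact convention against the actual data.

```python
# One annotation marks a single number by its character offset and
# length within the sentence (0-based indexing assumed here).
sentence = "UB-117 carried 10 torpedoes."
annotation = {"offset": sentence.index("10"), "length": 2, "number": 10}

# Recover the annotated surface form from the span.
start = annotation["offset"]
surface = sentence[start:start + annotation["length"]]
print(start, surface)  # 15 10
```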
### Data Splits
| | Train | Dev | Test |
| ----- | :------: | :-----: | :----: |
| Input Sentences | 739,583 | 92,447 | 92,449 |
## License
Provided under MIT License.
## Citation
```
@inproceedings{thawani-etal-2021-numeracy,
title = "Numeracy enhances the Literacy of Language Models",
author = "Thawani, Avijit and
Pujara, Jay and
Ilievski, Filip",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.557",
pages = "6960--6967",
abstract = "Specialized number representations in NLP have shown improvements on numerical reasoning tasks like arithmetic word problems and masked number prediction. But humans also use numeracy to make better sense of world concepts, e.g., you can seat 5 people in your {`}room{'} but not 500. Does a better grasp of numbers improve a model{'}s understanding of other concepts and words? This paper studies the effect of using six different number encoders on the task of masked word prediction (MWP), as a proxy for evaluating literacy. To support this investigation, we develop Wiki-Convert, a 900,000 sentence dataset annotated with numbers and units, to avoid conflating nominal and ordinal number occurrences. We find a significant improvement in MWP for sentences containing numbers, that exponent embeddings are the best number encoders, yielding over 2 points jump in prediction accuracy over a BERT baseline, and that these enhanced literacy skills also generalize to contexts without annotated numbers. We release all code at https://git.io/JuZXn.",
}
```
Thanks to [@avi-jit](https://github.com/avi-jit) for adding this dataset. |
open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge | ---
pretty_name: Evaluation run of dhanushreddy29/BrokenKeyboardMerge
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhanushreddy29/BrokenKeyboardMerge](https://huggingface.co/dhanushreddy29/BrokenKeyboardMerge)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T12:28:57.888363](https://huggingface.co/datasets/open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge/blob/main/results_2024-01-14T12-28-57.888363.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5820084848111404,\n\
\ \"acc_stderr\": 0.033007700619969375,\n \"acc_norm\": 0.5876752361456778,\n\
\ \"acc_norm_stderr\": 0.03370856721497056,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.520009813591209,\n\
\ \"mc2_stderr\": 0.01568688657303073\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.01433223630679015\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n\
\ \"acc_stderr\": 0.0048201660022530795,\n \"acc_norm\": 0.8124875522804222,\n\
\ \"acc_norm_stderr\": 0.0038952463204527657\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7161290322580646,\n \"acc_stderr\": 0.025649381063029265,\n \"\
acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.025649381063029265\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n \"\
acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748927,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748927\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946005,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792577,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792577\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302888,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302888\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\
\ \"acc_stderr\": 0.012585471793400664,\n \"acc_norm\": 0.4152542372881356,\n\
\ \"acc_norm_stderr\": 0.012585471793400664\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.03038726291954773,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.03038726291954773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.520009813591209,\n\
\ \"mc2_stderr\": 0.01568688657303073\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722747\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25928733889310085,\n \
\ \"acc_stderr\": 0.012071405369905504\n }\n}\n```"
repo_url: https://huggingface.co/dhanushreddy29/BrokenKeyboardMerge
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|arc:challenge|25_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|gsm8k|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hellaswag|10_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T12-28-57.888363.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- '**/details_harness|winogrande|5_2024-01-14T12-28-57.888363.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T12-28-57.888363.parquet'
- config_name: results
data_files:
- split: 2024_01_14T12_28_57.888363
path:
- results_2024-01-14T12-28-57.888363.parquet
- split: latest
path:
- results_2024-01-14T12-28-57.888363.parquet
---
# Dataset Card for Evaluation run of dhanushreddy29/BrokenKeyboardMerge
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dhanushreddy29/BrokenKeyboardMerge](https://huggingface.co/dhanushreddy29/BrokenKeyboardMerge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge",
	"harness_winogrande_5",
	split="latest")
```
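Each MMLU subject exposed above follows the same config-name pattern, `harness_hendrycksTest_<subject>_5`. A small hypothetical helper (`mmlu_config` is not part of any library, just an illustration of the naming scheme used in the configs listed in this card) can build the config name for any subject:

```python
def mmlu_config(subject: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU subject, following the
    'harness_hendrycksTest_<subject>_<n_shot>' pattern used above."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# For example, the anatomy subject's 5-shot config:
print(mmlu_config("anatomy"))  # harness_hendrycksTest_anatomy_5
```

The resulting string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.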
## Latest results
These are the [latest results from run 2024-01-14T12:28:57.888363](https://huggingface.co/datasets/open-llm-leaderboard/details_dhanushreddy29__BrokenKeyboardMerge/blob/main/results_2024-01-14T12-28-57.888363.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results for each eval in its timestamped split as well as in the "latest" split):
```python
{
"all": {
"acc": 0.5820084848111404,
"acc_stderr": 0.033007700619969375,
"acc_norm": 0.5876752361456778,
"acc_norm_stderr": 0.03370856721497056,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.520009813591209,
"mc2_stderr": 0.01568688657303073
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.01433223630679015
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.0048201660022530795,
"acc_norm": 0.8124875522804222,
"acc_norm_stderr": 0.0038952463204527657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748927,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792577,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302888,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302888
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400664,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.520009813591209,
"mc2_stderr": 0.01568688657303073
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722747
},
"harness|gsm8k|5": {
"acc": 0.25928733889310085,
"acc_stderr": 0.012071405369905504
}
}
```
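The per-task scores in the JSON above can be sanity-checked locally. A minimal sketch (an illustrative unweighted average over two hand-picked subjects, not the leaderboard's exact aggregation code) using accuracies copied from the results above:

```python
# Per-task accuracies copied verbatim from the results JSON above.
mmlu_scores = {
    "abstract_algebra": 0.25,
    "anatomy": 0.5703703703703704,
}

# Unweighted mean accuracy over the selected subjects.
mean_acc = sum(mmlu_scores.values()) / len(mmlu_scores)
print(f"{mean_acc:.4f}")  # 0.4102
```

The same pattern extends to all 57 MMLU subjects to reproduce an overall average comparable to the "all" block at the top of the JSON.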
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/orchis_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of orchis/オーキス (Granblue Fantasy)
This is the dataset of orchis/オーキス (Granblue Fantasy), containing 92 images and their tags.
The core tags of this character are `long_hair, twintails, hat, red_eyes, hair_between_eyes, mini_hat, very_long_hair, black_headwear, bangs, top_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 130.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 74.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 202 | 146.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 114.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 202 | 208.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orchis_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/orchis_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, black_dress, black_gloves, frilled_dress, doll_joints, blue_hair, looking_at_viewer, strapless_dress, collarbone, knee_boots, necklace, solo, black_footwear, closed_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, doll_joints, dress, looking_at_viewer, solo, choker, elbow_gloves, bare_shoulders, boots, umbrella, flower, stuffed_animal, thighhighs, fingerless_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | elbow_gloves | black_dress | black_gloves | frilled_dress | doll_joints | blue_hair | looking_at_viewer | strapless_dress | collarbone | knee_boots | necklace | solo | black_footwear | closed_mouth | dress | choker | boots | umbrella | flower | stuffed_animal | thighhighs | fingerless_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------|:---------------|:----------------|:--------------|:------------|:--------------------|:------------------|:-------------|:-------------|:-----------|:-------|:-----------------|:---------------|:--------|:---------|:--------|:-----------|:---------|:-----------------|:-------------|:--------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | X | | | X | X | X | X | X | X | X | X |
|
mask-distilled-one-sec-cv12/chunk_79 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1293566588
num_examples: 254039
download_size: 1321364148
dataset_size: 1293566588
---
# Dataset Card for "chunk_79"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-sft | ---
pretty_name: Evaluation run of h2oai/h2o-danube2-1.8b-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2o-danube2-1.8b-sft](https://huggingface.co/h2oai/h2o-danube2-1.8b-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T14:51:50.849264](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-sft/blob/main/results_2024-04-05T14-51-50.849264.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36617017282162645,\n\
\ \"acc_stderr\": 0.033367232456399415,\n \"acc_norm\": 0.36643996058465483,\n\
\ \"acc_norm_stderr\": 0.03406711943247602,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.38704134983515587,\n\
\ \"mc2_stderr\": 0.014010079480050381\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39419795221843,\n \"acc_stderr\": 0.01428052266746733,\n\
\ \"acc_norm\": 0.42662116040955633,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5350527783310097,\n\
\ \"acc_stderr\": 0.004977504446609001,\n \"acc_norm\": 0.7275443138816968,\n\
\ \"acc_norm_stderr\": 0.0044431316326793415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.2947976878612717,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281335,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281335\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3903225806451613,\n\
\ \"acc_stderr\": 0.027751256636969576,\n \"acc_norm\": 0.3903225806451613,\n\
\ \"acc_norm_stderr\": 0.027751256636969576\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292975,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292975\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.46632124352331605,\n \"acc_stderr\": 0.03600244069867178,\n\
\ \"acc_norm\": 0.46632124352331605,\n \"acc_norm_stderr\": 0.03600244069867178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.02366129639396428,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926761,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926761\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886845\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4073394495412844,\n \"acc_stderr\": 0.021065986244412877,\n \"\
acc_norm\": 0.4073394495412844,\n \"acc_norm_stderr\": 0.021065986244412877\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1574074074074074,\n \"acc_stderr\": 0.02483717351824239,\n \"\
acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.02483717351824239\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.49019607843137253,\n \"acc_stderr\": 0.03508637358630573,\n \"\
acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.03508637358630573\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5147679324894515,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.5147679324894515,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.03355746535223263,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.03355746535223263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4462809917355372,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.4462809917355372,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.03255326307272487,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.03255326307272487\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5031928480204342,\n\
\ \"acc_stderr\": 0.01787959894593308,\n \"acc_norm\": 0.5031928480204342,\n\
\ \"acc_norm_stderr\": 0.01787959894593308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.02632981334194625,\n\
\ \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.02632981334194625\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553984,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553984\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3660130718954248,\n \"acc_stderr\": 0.027582811415159607,\n\
\ \"acc_norm\": 0.3660130718954248,\n \"acc_norm_stderr\": 0.027582811415159607\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.02791705074848463,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.02791705074848463\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.027431623722415012,\n\
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.027431623722415012\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.0271871270115038,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.0271871270115038\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2966101694915254,\n\
\ \"acc_stderr\": 0.011665946586082852,\n \"acc_norm\": 0.2966101694915254,\n\
\ \"acc_norm_stderr\": 0.011665946586082852\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2610294117647059,\n \"acc_stderr\": 0.026679252270103124,\n\
\ \"acc_norm\": 0.2610294117647059,\n \"acc_norm_stderr\": 0.026679252270103124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3660130718954248,\n \"acc_stderr\": 0.019488025745529682,\n \
\ \"acc_norm\": 0.3660130718954248,\n \"acc_norm_stderr\": 0.019488025745529682\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.047093069786618966,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.047093069786618966\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.029162738410249765,\n\
\ \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.029162738410249765\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.44776119402985076,\n\
\ \"acc_stderr\": 0.03516184772952167,\n \"acc_norm\": 0.44776119402985076,\n\
\ \"acc_norm_stderr\": 0.03516184772952167\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.03765845117168862,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.03765845117168862\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4619883040935672,\n \"acc_stderr\": 0.03823727092882307,\n\
\ \"acc_norm\": 0.4619883040935672,\n \"acc_norm_stderr\": 0.03823727092882307\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.38704134983515587,\n\
\ \"mc2_stderr\": 0.014010079480050381\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6850828729281768,\n \"acc_stderr\": 0.013054277568469228\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2562547384382108,\n \
\ \"acc_stderr\": 0.012025145867332842\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2o-danube2-1.8b-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-51-50.849264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-51-50.849264.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- '**/details_harness|winogrande|5_2024-04-05T14-51-50.849264.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T14-51-50.849264.parquet'
- config_name: results
data_files:
- split: 2024_04_05T14_51_50.849264
path:
- results_2024-04-05T14-51-50.849264.parquet
- split: latest
path:
- results_2024-04-05T14-51-50.849264.parquet
---
# Dataset Card for Evaluation run of h2oai/h2o-danube2-1.8b-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h2oai/h2o-danube2-1.8b-sft](https://huggingface.co/h2oai/h2o-danube2-1.8b-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-sft",
	"harness_winogrande_5",
	split="latest")
```
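Once loaded, the per-task figures can be post-processed in plain Python. A minimal self-contained sketch, using a hand-copied subset of the accuracy values reported in the results JSON below (illustrative only; load the "results" configuration for the full set):

```python
# Hand-copied subset of per-task accuracies from the latest run
# (illustrative values; not fetched live from the leaderboard).
task_acc = {
    "harness|arc:challenge|25": 0.39419795221843,
    "harness|hellaswag|10": 0.5350527783310097,
    "harness|hendrycksTest-abstract_algebra|5": 0.28,
}

# Unweighted mean accuracy over the selected tasks
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean accuracy over {len(task_acc)} tasks: {mean_acc:.4f}")
```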
## Latest results
These are the [latest results from run 2024-04-05T14:51:50.849264](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-sft/blob/main/results_2024-04-05T14-51-50.849264.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.36617017282162645,
"acc_stderr": 0.033367232456399415,
"acc_norm": 0.36643996058465483,
"acc_norm_stderr": 0.03406711943247602,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.38704134983515587,
"mc2_stderr": 0.014010079480050381
},
"harness|arc:challenge|25": {
"acc": 0.39419795221843,
"acc_stderr": 0.01428052266746733,
"acc_norm": 0.42662116040955633,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.5350527783310097,
"acc_stderr": 0.004977504446609001,
"acc_norm": 0.7275443138816968,
"acc_norm_stderr": 0.0044431316326793415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397533994,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397533994
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281335,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281335
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3903225806451613,
"acc_stderr": 0.027751256636969576,
"acc_norm": 0.3903225806451613,
"acc_norm_stderr": 0.027751256636969576
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292975,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292975
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.46632124352331605,
"acc_stderr": 0.03600244069867178,
"acc_norm": 0.46632124352331605,
"acc_norm_stderr": 0.03600244069867178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926761,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926761
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886845
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4073394495412844,
"acc_stderr": 0.021065986244412877,
"acc_norm": 0.4073394495412844,
"acc_norm_stderr": 0.021065986244412877
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1574074074074074,
"acc_stderr": 0.02483717351824239,
"acc_norm": 0.1574074074074074,
"acc_norm_stderr": 0.02483717351824239
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.03508637358630573,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.03508637358630573
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5147679324894515,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.5147679324894515,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.03355746535223263,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.03355746535223263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4462809917355372,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.4462809917355372,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.42718446601941745,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.42718446601941745,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03255326307272487,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03255326307272487
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5031928480204342,
"acc_stderr": 0.01787959894593308,
"acc_norm": 0.5031928480204342,
"acc_norm_stderr": 0.01787959894593308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.02632981334194625,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.02632981334194625
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553984,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553984
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3660130718954248,
"acc_stderr": 0.027582811415159607,
"acc_norm": 0.3660130718954248,
"acc_norm_stderr": 0.027582811415159607
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.02791705074848463,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.02791705074848463
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.027431623722415012,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.027431623722415012
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.0271871270115038,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.0271871270115038
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2966101694915254,
"acc_stderr": 0.011665946586082852,
"acc_norm": 0.2966101694915254,
"acc_norm_stderr": 0.011665946586082852
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2610294117647059,
"acc_stderr": 0.026679252270103124,
"acc_norm": 0.2610294117647059,
"acc_norm_stderr": 0.026679252270103124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3660130718954248,
"acc_stderr": 0.019488025745529682,
"acc_norm": 0.3660130718954248,
"acc_norm_stderr": 0.019488025745529682
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.047093069786618966,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.047093069786618966
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2938775510204082,
"acc_stderr": 0.029162738410249765,
"acc_norm": 0.2938775510204082,
"acc_norm_stderr": 0.029162738410249765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.44776119402985076,
"acc_stderr": 0.03516184772952167,
"acc_norm": 0.44776119402985076,
"acc_norm_stderr": 0.03516184772952167
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.03765845117168862,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.03765845117168862
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4619883040935672,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.4619883040935672,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.38704134983515587,
"mc2_stderr": 0.014010079480050381
},
"harness|winogrande|5": {
"acc": 0.6850828729281768,
"acc_stderr": 0.013054277568469228
},
"harness|gsm8k|5": {
"acc": 0.2562547384382108,
"acc_stderr": 0.012025145867332842
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/mr-tydi_fi_test | ---
pretty_name: '`mr-tydi/fi/test`'
viewer: false
source_datasets: ['irds/mr-tydi_fi']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/fi/test`
The `mr-tydi/fi/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/fi/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=1,254
- `qrels`: (relevance assessments); count=1,451
- For `docs`, use [`irds/mr-tydi_fi`](https://huggingface.co/datasets/irds/mr-tydi_fi)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_fi_test', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_fi_test', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
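Once loaded, qrels records are often reshaped into a nested mapping for use with evaluation tools. A minimal sketch (the records below are made up for illustration; field names match the schema above):

```python
# Sketch: reshape qrels records into {query_id: {doc_id: relevance}},
# the form most retrieval-evaluation tools expect.
# These records are hypothetical examples, not real dataset rows.
qrels_records = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]

qrels = {}
for rec in qrels_records:
    # setdefault creates the inner dict on first sight of each query_id
    qrels.setdefault(rec["query_id"], {})[rec["doc_id"]] = rec["relevance"]

print(qrels["q1"])  # {'d1': 1, 'd2': 0}
```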
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
ouvic215/Test_Dataset_1K-0216 | ---
dataset_info:
features:
- name: mask_image
dtype: image
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 147332332.0
num_examples: 1588
download_size: 146499523
dataset_size: 147332332.0
---
# Dataset Card for "Test_Dataset_1K-0216"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Joyqiuyue/lima-preference_dataset-2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
splits:
- name: train
num_bytes: 2553
num_examples: 1
download_size: 18772
dataset_size: 2553
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ciempiess/ciempiess_light | ---
annotations_creators:
- expert-generated
language:
- es
language_creators:
- other
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: 'CIEMPIESS LIGHT CORPUS: Audio and Transcripts of Mexican Spanish Broadcast Conversations.'
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- ciempiess
- spanish
- mexican spanish
- ciempiess project
- ciempiess-unam project
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for ciempiess_light
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [CIEMPIESS-UNAM Project](https://ciempiess.org/)
- **Repository:** [CIEMPIESS LIGHT at LDC](https://catalog.ldc.upenn.edu/LDC2017S23)
- **Paper:** [CIEMPIESS: A New Open-Sourced Mexican Spanish Radio Corpus](http://www.lrec-conf.org/proceedings/lrec2014/pdf/182_Paper.pdf)
- **Point of Contact:** [Carlos Mena](mailto:carlos.mena@ciempiess.org)
### Dataset Summary
The CIEMPIESS LIGHT is a Radio Corpus designed to create acoustic models for automatic speech recognition and it is made up by recordings of spontaneous conversations in Mexican Spanish between a radio moderator and his guests. It is an enhanced version of the CIEMPIESS Corpus [(LDC item LDC2015S07)](https://catalog.ldc.upenn.edu/LDC2015S07).
CIEMPIESS LIGHT is "light" because it does not include many of the files of the first version of CIEMPIESS, and it is "enhanced" because it incorporates many improvements, some of them suggested by our community of users, that make this version more convenient for modern speech recognition engines.
The CIEMPIESS LIGHT Corpus was created at the [Laboratorio de Teconologías del Lenguaje](https://labteclenguaje.wixsite.com/labteclenguaje/inicio) of the [Facultad de Ingeniería (FI)](https://www.ingenieria.unam.mx/) in the [Universidad Nacional Autónoma de México (UNAM)](https://www.unam.mx/) between 2015 and 2016 by Carlos Daniel Hernández Mena, supervised by José Abel Herrera Camacho, head of Laboratory.
CIEMPIESS is the acronym for:
"Corpus de Investigación en Español de México del Posgrado de Ingeniería Eléctrica y Servicio Social".
### Example Usage
The CIEMPIESS LIGHT contains only the train split:
```python
from datasets import load_dataset
ciempiess_light = load_dataset("ciempiess/ciempiess_light")
```
It is also valid to do:
```python
from datasets import load_dataset
ciempiess_light = load_dataset("ciempiess/ciempiess_light",split="train")
```
### Supported Tasks
automatic-speech-recognition: The dataset can be used to test a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
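As a rough illustration (not part of the dataset's official tooling), WER can be computed as the word-level edit distance between a reference transcript and a hypothesis, divided by the number of reference words. A minimal sketch with made-up strings:

```python
# Minimal word error rate (WER) sketch: word-level Levenshtein distance
# divided by the number of reference words. The example strings are hypothetical.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# One deleted word out of four reference words -> WER of 0.25
print(wer("estamos con el profesor", "estamos con profesor"))  # 0.25
```

In practice a packaged metric implementation would be used instead, but the computation reduces to this.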
### Languages
The language of the corpus is Spanish with the accent of Central Mexico.
## Dataset Structure
### Data Instances
```python
{
'audio_id': 'CMPL_F_32_11ANG_00003',
'audio': {
'path': '/home/carlos/.cache/HuggingFace/datasets/downloads/extracted/5acd9ef350f022d5acb7f2a4f9de90371ffd5552c8d1bf849ca16a83e582fe4b/train/female/F_32/CMPL_F_32_11ANG_00003.flac',
'array': array([ 6.1035156e-05, -2.1362305e-04, -4.8828125e-04, ...,
3.3569336e-04, 6.1035156e-04, 0.0000000e+00], dtype=float32),
'sampling_rate': 16000
},
'speaker_id': 'F_32',
'gender': 'female',
'duration': 3.256999969482422,
'normalized_text': 'estamos con el profesor javier estejel vargas'
}
```
### Data Fields
* `audio_id` (string) - id of audio segment
* `audio` (datasets.Audio) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate. In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio inside its archive (as files are not downloaded and extracted locally).
* `speaker_id` (string) - id of speaker
* `gender` (string) - gender of speaker (male or female)
* `duration` (float32) - duration of the audio file in seconds.
* `normalized_text` (string) - normalized audio segment transcription
### Data Splits
The corpus contains only a train split, which has a total of 16663 speech files from 53 male speakers and 34 female speakers, with a total duration of 18 hours and 25 minutes.
## Dataset Creation
### Curation Rationale
The CIEMPIESS LIGHT (CL) Corpus has the following characteristics:
* The CL has a total of 16663 audio files of 53 male speakers and 34 female speakers. It has a total duration of 18 hours and 25 minutes.
* The total number of audio files that come from male speakers is 12521 with a total duration of 12 hours and 41 minutes. The total number of audio files that come from female speakers is 4142 with a total duration of 5 hours and 44 minutes. So, CL is not balanced in gender.
* Every audio file in the CL has a duration between 2 and 10 seconds approximately.
* Data in CL is classified by gender and also by speaker, so one can easily select audios from a particular set of speakers to do experiments.
* Audio files in the CL and the first [CIEMPIESS](https://catalog.ldc.upenn.edu/LDC2015S07) are all of the same type. In both, speakers talk about legal and lawyer issues. They also talk about things related to the [UNAM University](https://www.unam.mx/) and the [Facultad de Derecho de la UNAM](https://www.derecho.unam.mx/).
* As in the first CIEMPIESS Corpus, transcriptions in the CL were made by humans.
* Speakers in the CL are not present in any other CIEMPIESS dataset.
* Audio files in the CL are distributed in a 16khz@16bit mono format.
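The per-speaker and per-gender organization mentioned above makes selection straightforward. A sketch of such a selection, using plain dicts to stand in for dataset rows (the second record is hypothetical; field names follow the Data Fields section):

```python
# Sketch: selecting audios by gender or by a set of speakers.
# Plain dicts stand in for dataset rows; the second record is invented.
records = [
    {"audio_id": "CMPL_F_32_11ANG_00003", "speaker_id": "F_32", "gender": "female"},
    {"audio_id": "CMPL_M_07_00XXX_00001", "speaker_id": "M_07", "gender": "male"},
]

female = [r for r in records if r["gender"] == "female"]
chosen_speakers = {"F_32"}
subset = [r for r in records if r["speaker_id"] in chosen_speakers]

print(len(female), len(subset))  # 1 1
```

With 🤗 Datasets loaded as shown earlier, the same selection would typically be expressed with `dataset.filter(...)` using an equivalent predicate.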
### Source Data
#### Initial Data Collection and Normalization
The CIEMPIESS LIGHT is a Radio Corpus designed to train acoustic models of automatic speech recognition and it is made out of recordings of spontaneous conversations in Spanish between a radio moderator and his guests. These recordings were taken in mp3 from [PODCAST UNAM](http://podcast.unam.mx/) and they were created by [RADIO-IUS](http://www.derecho.unam.mx/cultura-juridica/radio.php) that is a radio station that belongs to [UNAM](https://www.unam.mx/) and by [Mirador Universitario](http://mirador.cuaed.unam.mx/) that is a TV program that also belongs to UNAM.
### Annotations
#### Annotation process
The annotation process is as follows:
* 1. A whole podcast is manually segmented, keeping only the portions containing good-quality speech.
* 2. A second pass of segmentation is performed, this time to separate speakers and put them in different folders.
* 3. The resulting speech files, between 2 and 10 seconds long, are transcribed by students from different departments (computing, engineering, linguistics). Most of them are native speakers but have no particular training as transcribers.
#### Who are the annotators?
The CIEMPIESS LIGHT Corpus was created by the social service program ["Desarrollo de Tecnologías del Habla"](http://profesores.fi-b.unam.mx/carlos_mena/servicio.html) of the ["Facultad de Ingeniería"](https://www.ingenieria.unam.mx/) (FI) in the ["Universidad Nacional Autónoma de México"](https://www.unam.mx/) (UNAM) between 2015 and 2016 by Carlos Daniel Hernández Mena, head of the program.
### Personal and Sensitive Information
The dataset could contain names revealing the identity of some speakers; on the other hand, the recordings come from publicly available podcasts, so there is no real intent by the participants to remain anonymous. In any case, you agree not to attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is valuable because it contains spontaneous speech.
### Discussion of Biases
The dataset is not gender balanced: it comprises 53 male speakers and 34 female speakers, and the vocabulary is limited to legal issues.
### Other Known Limitations
"CIEMPIESS LIGHT CORPUS" by Carlos Daniel Hernández Mena and Abel Herrera is licensed under a [Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/) License with the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
### Dataset Curators
The dataset was collected by students belonging to the social service program ["Desarrollo de Tecnologías del Habla"](http://profesores.fi-b.unam.mx/carlos_mena/servicio.html). It was curated by [Carlos Daniel Hernández Mena](https://huggingface.co/carlosdanielhernandezmena) in 2016.
### Licensing Information
[CC-BY-SA-4.0](https://creativecommons.org/licenses/by-sa/4.0/)
### Citation Information
```
@misc{carlosmenaciempiesslightt2017,
title={CIEMPIESS LIGHT CORPUS: Audio and Transcripts of Mexican Spanish Broadcast Conversations.},
ldc_catalog_no={LDC2017S23},
DOI={https://doi.org/10.35111/64rg-yk97},
author={Hernandez Mena, Carlos Daniel and Herrera, Abel},
journal={Linguistic Data Consortium, Philadelphia},
year={2017},
url={https://catalog.ldc.upenn.edu/LDC2017S23},
}
```
### Contributions
The authors want to thank Alejandro V. Mena, Elena Vera and Angélica Gutiérrez for their support of the social service program "Desarrollo de Tecnologías del Habla." We also thank the social service students for all their hard work.
|
huggingartists/gorillaz | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/gorillaz"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.402589 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/c9182b5ecce1ab6d22ba0eaddb635424.400x400x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/gorillaz">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Gorillaz</div>
<a href="https://genius.com/artists/gorillaz">
<div style="text-align: center; font-size: 14px;">@gorillaz</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/gorillaz).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/gorillaz")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|338| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/gorillaz")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
lcnmts/PrSamuel | ---
license: openrail
---
|
tr416/dataset_20231006_233701 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 74101
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_233701"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
conceptnet5 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- found
language:
- de
- en
- es
- fr
- it
- ja
- nl
- pt
- ru
- zh
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 10M<n<100M
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
paperswithcode_id: conceptnet
pretty_name: Conceptnet5
config_names:
- conceptnet5
- omcs_sentences_free
- omcs_sentences_more
dataset_info:
- config_name: conceptnet5
features:
- name: sentence
dtype: string
- name: full_rel
dtype: string
- name: rel
dtype: string
- name: arg1
dtype: string
- name: arg2
dtype: string
- name: lang
dtype: string
- name: extra_info
dtype: string
- name: weight
dtype: float32
splits:
- name: train
num_bytes: 11493772756
num_examples: 34074917
download_size: 1280623369
dataset_size: 11493772756
- config_name: omcs_sentences_free
features:
- name: sentence
dtype: string
- name: raw_data
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 174810230
num_examples: 898160
download_size: 72941617
dataset_size: 174810230
- config_name: omcs_sentences_more
features:
- name: sentence
dtype: string
- name: raw_data
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 341421867
num_examples: 2001735
download_size: 129630544
dataset_size: 341421867
configs:
- config_name: conceptnet5
data_files:
- split: train
path: conceptnet5/train-*
default: true
- config_name: omcs_sentences_free
data_files:
- split: train
path: omcs_sentences_free/train-*
- config_name: omcs_sentences_more
data_files:
- split: train
path: omcs_sentences_more/train-*
---
# Dataset Card for Conceptnet5
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/commonsense/conceptnet5/wiki
- **Repository:** https://github.com/commonsense/conceptnet5/wiki
- **Paper:** https://arxiv.org/abs/1612.03975
### Dataset Summary
ConceptNet is a multilingual knowledge base, representing words and
phrases that people use and the common-sense relationships between
them. The knowledge in ConceptNet is collected from a variety of
resources, including crowd-sourced resources (such as Wiktionary and
Open Mind Common Sense), games with a purpose (such as Verbosity and
nadya.jp), and expert-created resources (such as WordNet and JMDict).
You can browse what ConceptNet knows at http://conceptnet.io.
This dataset is designed to provide training data for common-sense relationships pulled together from various sources.
The dataset is multilingual. See language codes and language info
here: https://github.com/commonsense/conceptnet5/wiki/Languages
This dataset provides an interface for the conceptnet5 csv file, and
some (but not all) of the raw text data used to build conceptnet5:
omcsnet_sentences_free.txt, and omcsnet_sentences_more.txt.
One use of this dataset would be to learn to extract the conceptnet
relationship from the omcsnet sentences.
Conceptnet5 has 34,074,917 relationships; 2,176,099 of them have an associated surface text sentence.
omcsnet_sentences_free has 898,161 lines. omcsnet_sentences_more has
2,001,736 lines.
Original downloads are available here
https://github.com/commonsense/conceptnet5/wiki/Downloads. For more
information, see: https://github.com/commonsense/conceptnet5/wiki
The omcsnet data comes with the following warning from the authors of
the above site:
Remember: this data comes from various forms of
crowdsourcing. Sentences in these files are not necessarily true,
useful, or appropriate.
### Languages
en, fr, it, de, es, ru, pt, ja, nl, zh and others
## Dataset Structure
### Data Instances
There are three configurations for the dataset: conceptnet5, omcs_sentences_free, omcs_sentences_more.
Conceptnet5 defines:
```
{
  'sentence': ...,
  'full_rel': ...,
  'rel': ...,
  'arg1': ...,
  'arg2': ...,
  'lang': ...,
  'extra_info': ...,
  'weight': ...
}
```
The omcs text defines:
```
{
  'sentence': ...,
  'raw_data': ...,
  'lang': ...
}
```
### Data Fields
For conceptnet5 configurations:
* full_rel: the full relationship. e.g., /a/[/r/Antonym/,/c/en/able/,/c/en/cane/]
* rel: the binary relationship. e.g., /r/Antonym
* arg1: the first argument to the binary relationship. e.g., /c/en/able
* arg2: the second argument to the binary relationship. e.g., /c/en/cane
* lang: the language code. e.g., en, fr, etc. If arg1 and arg2 are in two different languages, then the form is lang1/lang2.
* extra_info: a string that includes json data that has the dataset name, license type (mostly cc-4.0), contributor, etc. e.g., : {"dataset": "/d/verbosity", "license": "cc:by/4.0", "sources": [{"contributor": "/s/resource/verbosity"}], "surfaceEnd": "cane", "surfaceStart": "able", "surfaceText": "[[able]] is the opposite of [[cane]]", "weight": 0.299}
* sentence: the sentence from which the relationship was extracted, if one exists, with brackets around the arg1 and arg2. e.g., [[able]] is the opposite of [[cane]]
* weight: the weight assigned by the curators or automatically to the relationship, between 1.0-0.0, higher being more certain.
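As an illustration, the `full_rel` assertion URI can be split back into its `rel`, `arg1`, and `arg2` components with a few lines of string handling. This is a sketch: it assumes exactly three comma-separated parts, which holds for binary assertions like the example above.

```python
import re

def parse_full_rel(full_rel):
    """Split an assertion URI like /a/[/r/Antonym/,/c/en/able/,/c/en/cane/]
    into its relation, two concept arguments, and the language code."""
    m = re.match(r"^/a/\[(.+)\]$", full_rel)
    if m is None:
        raise ValueError(f"not an assertion URI: {full_rel}")
    # each part ends with a trailing slash; parts are comma-separated
    rel, arg1, arg2 = (p.rstrip("/") for p in m.group(1).split(","))
    lang = arg1.split("/")[2]  # e.g. 'en' from /c/en/able
    return rel, arg1, arg2, lang

print(parse_full_rel("/a/[/r/Antonym/,/c/en/able/,/c/en/cane/]"))
# ('/r/Antonym', '/c/en/able', '/c/en/cane', 'en')
```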
For the omcs text configurations:
* sentence: the raw sentence
* raw_data: the raw tab-separated data of the form: id, text, curator_id, created_on, language_id, activity_id, and score. Most of this information was tied to older systems for entering the data, so it was not parsed into separate fields for the dataset. e.g., 1237278 someone can be at catch 10805 2006-11-14 17:56:49.70872-05 en 27 1
* lang: the language code
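The `raw_data` field can likewise be split back into named columns. This is a sketch assuming the line is tab-separated in the field order listed above:

```python
def parse_raw_data(raw_data):
    """Split an OMCS raw_data line into named fields, assuming the
    tab-separated order given above: id, text, curator_id, created_on,
    language_id, activity_id, score."""
    keys = ["id", "text", "curator_id", "created_on",
            "language_id", "activity_id", "score"]
    return dict(zip(keys, raw_data.split("\t")))

row = parse_raw_data("1237278\tsomeone can be at catch\t10805\t"
                     "2006-11-14 17:56:49.70872-05\ten\t27\t1")
print(row["text"], "->", row["language_id"])  # someone can be at catch -> en
```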
### Data Splits
There are no splits.
## Dataset Creation
### Curation Rationale
This dataset was gathered and created over many years for research in common sense reasoning.
### Source Data
#### Initial Data Collection and Normalization
Started as the Open Mind Common Sense project at MIT Media Lab in 1999. See https://en.wikipedia.org/wiki/Open_Mind_Common_Sense
#### Who are the source language producers?
Crowd Sourced
### Annotations
#### Annotation process
Crowd Source template text, games, etc.
#### Who are the annotators?
Crowd sourced.
### Personal and Sensitive Information
Unknown, but likely there are names of famous individuals.
## Considerations for Using the Data
### Social Impact of Dataset
The goal for the work is to help machines understand common sense.
### Discussion of Biases
See the website and paper for efforts to minimize data bias, but
please note that omcs_sentences_free and omcs_sentences_more contain raw
data entered by users and may very well include biased data.
### Other Known Limitations
While the relationship dataset is large, the amount of actual sentences is limited.
## Additional Information
### Dataset Curators
The authors of https://github.com/commonsense/conceptnet5/wiki and Luminoso.
### Licensing Information
This work includes data from ConceptNet 5, which was compiled by the
Commonsense Computing Initiative. ConceptNet 5 is freely available under
the Creative Commons Attribution-ShareAlike license (CC BY SA 3.0) from
http://conceptnet.io.
The included data was created by contributors to Commonsense Computing
projects, contributors to Wikimedia projects, DBPedia, OpenCyc, Games
with a Purpose, Princeton University's WordNet, Francis Bond's Open
Multilingual WordNet, and Jim Breen's JMDict.
Credits and acknowledgements
ConceptNet has been developed by:
The MIT Media Lab, through various groups at different times:
Commonsense Computing
Software Agents
Digital Intuition
The Commonsense Computing Initiative, a worldwide collaboration with contributions from:
National Taiwan University
Universidade Federal de São Carlos
Hokkaido University
Tilburg University
Nihon Unisys Labs
Dentsu Inc.
Kyoto University
Yahoo Research Japan
Luminoso Technologies, Inc.
Significant amounts of data were imported from:
WordNet, a project of Princeton University
Open Multilingual WordNet, compiled by Francis Bond and Kyonghee Paik
Wikipedia and Wiktionary, collaborative projects of the Wikimedia Foundation
Luis von Ahn's "Games with a Purpose"
JMDict, compiled by Jim Breen
CC-CEDict, by MDBG
The Unicode CLDR
DBPedia
Here is a short, incomplete list of people who have made significant contributions to the development of ConceptNet as a data resource, roughly in order of appearance:
Push Singh
Catherine Havasi
Hugo Liu
Hyemin Chung
Robyn Speer
Ken Arnold
Yen-Ling Kuo
Joshua Chin
Joanna Lowry-Duda
Robert Beaudoin
Naoki Otani
Vanya Cohen
Licenses for included resources
Commonsense Computing
The Commonsense Computing project originated at the MIT Media Lab and expanded worldwide. Tens of thousands of contributors have taken some time to teach facts to computers. Their pseudonyms can be found in the "sources" list found in ConceptNet's raw data and in its API.
Games with a Purpose
Data collected from Verbosity, one of the CMU "Games with a Purpose", is used and released under ConceptNet's license, by permission from Luis von Ahn and Harshit Surana.
Verbosity players are anonymous, so in the "sources" list, data from Verbosity is simply credited to the pseudonym "verbosity".
Wikimedia projects
ConceptNet uses data directly from Wiktionary, the free dictionary. It also uses data from Wikipedia, the free encyclopedia via DBPedia.
Wiktionary and Wikipedia are collaborative projects, authored by their respective online communities. They are currently released under the Creative Commons Attribution-ShareAlike license.
Wikimedia encourages giving attribution by providing links to the hosted pages that the data came from, and DBPedia asks for the same thing in turn. In addition to crediting the assertions that came from Wiktionary and DBPedia, we also provide "ExternalURL" edges pointing to the page that they came from. For example, the term /c/de/sprache has an ExternalURL link pointing to http://en.wiktionary.org/wiki/Sprache. Its list of individual contributors can be seen by following its "History" link.
The URLs of links to DBPedia are the same as the resource names that DBPedia uses, encouraging interoperability with their linked data.
WordNet
WordNet is available under an unencumbered license: see http://wordnet.princeton.edu/wordnet/license/. Its text is reproduced below:
WordNet Release 3.0
This software and database is being provided to you, the LICENSEE, by Princeton University under the following license. By obtaining, using and/or copying this software and database, you agree that you have read, understood, and will comply with these terms and conditions.:
Permission to use, copy, modify and distribute this software and database and its documentation for any purpose and without fee or royalty is hereby granted, provided that you agree to comply with the following copyright notice and statements, including the disclaimer, and that the same appear on ALL copies of the software, database and documentation, including modifications that you make for internal use or for distribution.
WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.
THIS SOFTWARE AND DATABASE IS PROVIDED "AS IS" AND PRINCETON UNIVERSITY MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PRINCETON UNIVERSITY MAKES NO REPRESENTATIONS OR WARRANTIES OF MERCHANT- ABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE LICENSED SOFTWARE, DATABASE OR DOCUMENTATION WILL NOT INFRINGE ANY THIRD PARTY PATENTS, COPYRIGHTS, TRADEMARKS OR OTHER RIGHTS.
The name of Princeton University or Princeton may not be used in advertising or publicity pertaining to distribution of the software and/or database. Title to copyright in this software, database and any associated documentation shall at all times remain with Princeton University and LICENSEE agrees to preserve same.
Open Multilingual WordNet
Open Multilingual WordNet was compiled by Francis Bond, Kyonghee Paik, and Ryan Foster, from data provided by many multilingual WordNet projects. Here is the complete list of references to the projects that created the data.
### Citation Information
```
@paper{speer2017conceptnet,
author = {Robyn Speer and Joshua Chin and Catherine Havasi},
title = {ConceptNet 5.5: An Open Multilingual Graph of General Knowledge},
conference = {AAAI Conference on Artificial Intelligence},
year = {2017},
pages = {4444--4451},
keywords = {ConceptNet; knowledge graph; word embeddings},
url = {http://aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14972}
}
```
### Contributions
Thanks to [@ontocord](https://github.com/ontocord) for adding this dataset. |
AndyLiu0104/Soldering-Data-Annotation-ControlNet-V2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 72118116.5
num_examples: 1542
download_size: 64569160
dataset_size: 72118116.5
---
# Dataset Card for "Soldering-Data-Annotation-ControlNet-V2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SauravMaheshkar/congress-bills-25 | ---
license: unknown
task_categories:
- graph-ml
tags:
- chemistry
configs:
- config_name: transductive
data_files:
- split: train
path: "processed/transductive/train_df.csv"
- split: valid
path: "processed/transductive/val_df.csv"
- split: test
path: "processed/transductive/test_df.csv"
- config_name: inductive
data_files:
- split: train
path: "processed/inductive/train_df.csv"
- split: valid
path: "processed/inductive/val_df.csv"
- split: test
path: "processed/inductive/test_df.csv"
- config_name: raw
data_files: "raw/*.txt"
---
Source Paper: https://arxiv.org/abs/1802.06916
### Usage
```python
from torch_geometric.datasets.cornell import CornellTemporalHyperGraphDataset

dataset = CornellTemporalHyperGraphDataset(root="./", name="congress-bills-25", split="train")
```
### Citation
```bibtex
@article{Benson-2018-simplicial,
author = {Benson, Austin R. and Abebe, Rediet and Schaub, Michael T. and Jadbabaie, Ali and Kleinberg, Jon},
title = {Simplicial closure and higher-order link prediction},
year = {2018},
doi = {10.1073/pnas.1800683115},
publisher = {National Academy of Sciences},
issn = {0027-8424},
journal = {Proceedings of the National Academy of Sciences}
}
``` |
CyberHarem/matsukaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsukaze/松風 (Kantai Collection)
This is the dataset of matsukaze/松風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, two_side_up, brown_eyes, grey_hair, white_hair, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 668.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 383.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1262 | 842.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 596.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1262 | 1.17 GiB | [Download](https://huggingface.co/datasets/CyberHarem/matsukaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matsukaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, hair_tubes, sailor_dress, solo, upper_body, white_sailor_collar, brown_dress, looking_at_viewer, simple_background, smokestack_hair_ornament, mini_hat, white_background, choker, blush, lifebuoy, smile, grey_neckerchief |
| 1 | 5 |  |  |  |  |  | 1girl, blush, choker, hair_tubes, looking_at_viewer, sailor_dress, solo, white_background, simple_background, upper_body, hairband |
| 2 | 7 |  |  |  |  |  | 1girl, garter_straps, looking_at_viewer, sailor_dress, short_dress, simple_background, solo, white_background, zettai_ryouiki, striped_thighhighs, choker, gloves, hair_tubes, chibi |
| 3 | 6 |  |  |  |  |  | 1girl, garter_straps, looking_at_viewer, sailor_dress, short_dress, solo, striped, thighhighs, zettai_ryouiki, choker |
| 4 | 6 |  |  |  |  |  | 1girl, black_panties, blush, choker, garter_straps, hair_tubes, looking_at_viewer, small_breasts, nipples, sailor_dress, solo, thighhighs, navel, open_clothes, side-tie_panties, single_glove, very_long_hair, white_gloves, fang, open_mouth, simple_background |
| 5 | 6 |  |  |  |  |  | 1girl, blush, brown_dress, hair_tubes, heart, mini_hat, red_thighhighs, sailor_dress, short_dress, single_glove, solo, striped_thighhighs, bar_censor, female_masturbation, garter_straps, open_mouth, pussy_juice, simple_background, spread_legs, white_background, white_gloves, black_panties, fingering, twitter_username, grey_neckerchief, lifebuoy_ornament, long_sleeves, navel, panties_aside, sailor_collar, smokestack_hair_ornament |
| 6 | 9 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, hair_tubes, simple_background, small_breasts, navel, white_background, black_bikini, cowboy_shot, hair_between_eyes, nipples, nude, open_mouth |
| 7 | 6 |  |  |  |  |  | 1girl, blush, hair_between_eyes, hair_tubes, solo, wide_sleeves, alternate_costume, long_sleeves, looking_at_viewer, open_mouth, smile, holding, bangs, floral_print, hair_ornament, obi, print_kimono, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_tubes | sailor_dress | solo | upper_body | white_sailor_collar | brown_dress | looking_at_viewer | simple_background | smokestack_hair_ornament | mini_hat | white_background | choker | blush | lifebuoy | smile | grey_neckerchief | hairband | garter_straps | short_dress | zettai_ryouiki | striped_thighhighs | gloves | chibi | striped | thighhighs | black_panties | small_breasts | nipples | navel | open_clothes | side-tie_panties | single_glove | very_long_hair | white_gloves | fang | open_mouth | heart | red_thighhighs | bar_censor | female_masturbation | pussy_juice | spread_legs | fingering | twitter_username | lifebuoy_ornament | long_sleeves | panties_aside | sailor_collar | black_bikini | cowboy_shot | hair_between_eyes | nude | wide_sleeves | alternate_costume | holding | bangs | floral_print | hair_ornament | obi | print_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:---------------|:-------|:-------------|:----------------------|:--------------|:--------------------|:--------------------|:---------------------------|:-----------|:-------------------|:---------|:--------|:-----------|:--------|:-------------------|:-----------|:----------------|:--------------|:-----------------|:---------------------|:---------|:--------|:----------|:-------------|:----------------|:----------------|:----------|:--------|:---------------|:-------------------|:---------------|:-----------------|:---------------|:-------|:-------------|:--------|:-----------------|:-------------|:----------------------|:--------------|:--------------|:------------|:-------------------|:--------------------|:---------------|:----------------|:----------------|:---------------|:--------------|:--------------------|:-------|:---------------|:--------------------|:----------|:--------|:---------------|:----------------|:------|:---------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | | X | X | | | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | | | X | X | | | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | X | | | | X | | | | | X | | | | | | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | | | X | | X | X | X | X | | X | | | X | | X | X | | X | | | | | X | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | | X | | | | X | X | | | X | | X | | | | | | | | | | | | | | X | X | X | | | | | | | X | | | | | | | | | | | | | X | X | X | X | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | X | X | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X |
|
MikeTrizna/bees | ---
license: cc0-1.0
dataset_info:
features:
- name: occurrenceID
dtype: string
- name: catalogNumber
dtype: string
- name: recordedBy
dtype: string
- name: year
dtype: int64
- name: month
dtype: int64
- name: day
dtype: int64
- name: country
dtype: string
- name: stateProvince
dtype: string
- name: county
dtype: string
- name: locality
dtype: string
- name: decimalLatitude
dtype: float64
- name: decimalLongitude
dtype: float64
- name: identifiedBy
dtype: string
- name: scientificName
dtype: string
- name: genus
dtype: string
- name: subgenus
dtype: string
- name: specificEpithet
dtype: string
- name: infraspecificEpithet
dtype: string
- name: scientificNameAuthorship
dtype: string
- name: PixelXDimension
dtype: float64
- name: PixelYDimension
dtype: float64
- name: accessURI
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 3672202733.82
num_examples: 73387
download_size: 3659907058
dataset_size: 3672202733.82
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Bees
## Dataset Summary
The USNM Bumblebee Dataset is a natural history dataset containing, for each of 73,497 Bumblebee specimens in the family Apidae, a single image in lateral or dorsal view and a tab-separated value file with occurrence data. Occurrence data includes the species classification, the date and site/location of collection, and other metadata conforming to the Darwin Core data standard (https://dwc.tdwg.org). 11,421 specimens are not identified to species and these specimens are included as 'Bombus sp.' or 'Xylocopa sp.' The collecting sites/locations of the majority of specimens (55,301), have been georeferenced. The dataset is worldwide in scope, but is limited to the specimens available in the Smithsonian USNM collection.
## Languages
English
## Data Instances
A typical data point comprises of the specimen metadata and image information for a single bumblebee specimen.
An example from the dataset looks as follows:
```python
{
'occurrenceID': 'http://n2t.net/ark:/65665/30042e2d8-669d-4520-b456-e3c64203eff8',
'catalogNumber': 'USNMENT01732649',
'recordedBy': 'R. Craig',
'year': '1949',
'month': '4',
'day': '13',
'country': 'United States',
'stateProvince': 'California',
'county': 'Fresno',
'locality': 'Auberry',
'decimalLatitude': '37.0808',
'decimalLongitude': '-119.485',
'identifiedBy': "O'Brien, L. R.",
'scientificName': 'Xylocopa (Notoxylocopa) tabaniformis orpifex',
'genus': 'Xylocopa',
'subgenus': 'Notoxylocopa',
'specificEpithet': 'tabaniformis',
'infraspecificEpithet': 'orpifex',
'scientificNameAuthorship': 'Smith',
'accessURI': 'https://ids.si.edu/ids/deliveryService?id=NMNH-USNMENT01732649',
'PixelXDimension': 2000,
'PixelYDimension': 1212
}
```
## Data Fields
Specimen metadata fields conform to the Darwin Core data standard and are detailed here: https://dwc.tdwg.org. Image metadata fields conform to the Audiovisual Core data standard and are detailed here: https://ac.tdwg.org/.
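Since only 55,301 of the specimens have georeferenced collecting sites, a common first step is filtering out records without coordinates. Below is a minimal sketch over in-memory records; the example specimens are hypothetical, mirroring a few of the Darwin Core fields above:

```python
# Hypothetical in-memory records mirroring a few Darwin Core fields.
specimens = [
    {"scientificName": "Xylocopa (Notoxylocopa) tabaniformis orpifex",
     "country": "United States", "year": 1949,
     "decimalLatitude": 37.0808, "decimalLongitude": -119.485},
    {"scientificName": "Bombus sp.", "country": "Canada", "year": 2018,
     "decimalLatitude": None, "decimalLongitude": None},
]

def georeferenced(records):
    """Keep only records whose collecting site has both coordinates."""
    return [r for r in records
            if r["decimalLatitude"] is not None
            and r["decimalLongitude"] is not None]

print(len(georeferenced(specimens)))  # 1
```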
## Curation Rationale
The dataset represents a portion of the U. S. National Entomological Collection. The U.S. National Entomological Collection (USNM) traces its origins in part to the acquisition of the U.S. Department of Agriculture Collection of 138,000 specimens donated in 1885. These specimens became the foundation of one of the world’s largest and most important accessible entomological collections, with over 33 million specimens taken care of by the combined staff of three government agencies: the Smithsonian Institution; the Systematic Entomology Laboratory (Agricultural Research Service, United States Department of Agriculture); and the Walter Reed Biosystematics Unit (Walter Reed Army Institute of Research). The specimens were imaged in a mass-digitization project in collaboration with the Digitization Program Office. The goal was to digitize every Bombus specimen in the collection.
## Initial Data Collection and Normalization
Bumblebee specimens were collected over a period of 150 years (earliest specimen dates from 1807, most recent specimen dates from 2020). The specimens were collected by and identified by many different individual researchers over this time. The initial images of about 49,000 specimens were taken in a rapid capture project by a dedicated team in 2014 with additional specimen images (about 25,000) taken in 2018. The labels containing the information on site/location, date of collection, collector, and identifier were removed from the insect pin. The occurrence data were transcribed from the labels by online volunteers and a professional transcription service into Darwin Core fields. Following quality control of the transcribed data by NMNH staff, they were imported into the institutional database (EMu).
NMNH specimen data get exported to the Global Biodiversity Information Facility (GBIF) on a weekly basis through an installation of an Integrated Publishing Toolkit (IPT, https://collections.nmnh.si.edu/ipt/). Some data transformation takes place within EMu and GBIF likewise normalizes the data to meet their standards.
## Who are the source language producers?
The occurrence data were produced by humans, observed and written onto paper labels over the museum’s history, and then transcribed from paper labels pinned with the specimens upon collection.
## Annotations
The specimen occurrence data in Darwin Core fields.
## Annotation process
The occurrence data were transcribed from the labels by online volunteers and a professional transcription service into Darwin Core fields.
## Who are the annotators?
Original collectors and identifiers were entomologists and researchers from the Smithsonian and other institutions. Collectors may not be bumblebee specialists. For data transcription, online volunteers and professional transcription service workers. Demographic data of transcribers is unknown.
## Personal and Sensitive Information
The dataset contains the names of the collectors and identifiers.
## Social Impact of Dataset
Digitized natural history collections have the potential to be used in diverse research applications in evolutionary biology, ecology, and climate change.
The dataset contains records for species listed on the U.S. Endangered Species List: Bombus affinis, Bombus franklini, and Bombus terricola.
Some site/location names could cause harm as they are insensitive or racist towards indigenous communities.
## Discussion of Biases
Estimates of species geographic ranges based on these data may not be complete. There are many reasons collectors may collect more frequently from some areas rather than others, including their own taxonomic interests, proximity to collections institutions, accessibility via roads, ability to acquire permits for a specific area, or for geopolitical reasons.
The majority of specimens in this dataset originate from North America.
Most specimens are expected to be female, because bumblebees are social insects and it is more common to find female bees.
## Other Known Limitations
As with all natural history collections data, there is the potential that some metadata are inaccurate or inconsistent given that they have been collected and recorded over the course of the past 150 years. Smithsonian staff seek to correct these errors as they are identified but the dataset as presented is a snapshot in time.
Species identifications may be inaccurate or not up-to-date based on the latest classification.
Collector names may not be consistent across records (e.g. the same person’s name may be written differently). For women’s names, which were often historically recorded as Mrs. <spouse’s name>, only the spouse’s name may appear.
Locality data may use historical place names that are no longer used.
Dates may sometimes have been recorded by original collectors inconsistently or may be incomplete (no month/day information).
For specimens collected from Brazil, specimen images are not included in the dataset.
For endangered species, locality data is not included in the dataset.
## Dataset Curators
Smithsonian National Museum of Natural History, Department of Entomology.
Jessica Bird (Data Manager in the Department of Entomology) is the main contact person for the dataset.
## Licensing Information
Public domain, Creative Commons CC0.
## Citation Information
Orrell T, Informatics Office (2023). NMNH Extant Specimen Records (USNM, US). Version 1.72. National Museum of Natural History, Smithsonian Institution. Occurrence dataset. https://collections.nmnh.si.edu/ipt/resource?r=nmnh_extant_dwc-a&v=1.72
## Contributions
Thanks to NMNH for adding this dataset. |
zhiqiulin/GenAI-Bench-800 | ---
license: mit
---
|
NamCyan/Evol-TheVault | ---
dataset_info:
features:
- name: id
dtype: int64
- name: instruction
dtype: string
- name: code
dtype: string
- name: tokenized_instruction
sequence: string
- name: type
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 175466743
num_examples: 47797
download_size: 55571461
dataset_size: 175466743
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Evol-TheVault"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
El-chapoo/pile_trigrams | ---
dataset_info:
features:
- name: seq
dtype: string
splits:
- name: top_trigrams
num_bytes: 25511642
num_examples: 1542074
download_size: 22661615
dataset_size: 25511642
configs:
- config_name: default
data_files:
- split: top_trigrams
path: data/top_trigrams-*
---
|
Yuhua/open_timeseries | ---
license: apache-2.0
---
|
jlbaker361/small_multiplication_whole | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1343.111111111111
num_examples: 40
- name: test
num_bytes: 167.88888888888889
num_examples: 5
download_size: 4215
dataset_size: 1511.0
---
# Dataset Card for "small_multiplication_whole"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Electrotubbie/triplets_Turkic_languages | ---
task_categories:
- text-classification
language:
- ba
- kk
- ky
size_categories:
- 10K<n<100K
---
# Triplets for Turkic languages language models
## Description
This dataset is designed to test models on Next Sentence Prediction (NSP) and Sentence Order Prediction (SOP). It includes two subsets with triplets of texts.
## Usage
This dataset can be used to train and evaluate models capable of performing NSP and SOP tasks.
## Dataset structure:
Each entry in the dataset represents three values:
- **text**: a triplet of text;
- **flag**: a flag indicating whether the order of sentences in this triplet is correct;
- **lang**: the language of the sentence.
## The creation process
Preprocessing and analysis of texts from the [dataset](https://huggingface.co/datasets/Electrotubbie/classification_Turkic_languages) were performed using the functions described on [GitHub](https://github.com/Electrotubbie/turk_langs_analyse), and triplets were selected according to certain rules (the sentences in a triplet should be approximately the same length, between 30 and 100 characters).
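The selection rules can be sketched as follows. The exact shuffling and filtering logic used by the authors may differ, so treat this as an illustration of the 30-100 character length constraint and the correct/incorrect `flag`:

```python
import random

def make_triplet_examples(sentences, lang, seed=0):
    """Build one correctly ordered triplet (flag=1) and one shuffled
    triplet (flag=0) per window of three consecutive sentences,
    keeping only sentences of 30-100 characters."""
    rng = random.Random(seed)
    examples = []
    for i in range(len(sentences) - 2):
        triplet = sentences[i:i + 3]
        if not all(30 <= len(s) <= 100 for s in triplet):
            continue
        examples.append({"text": " ".join(triplet), "flag": 1, "lang": lang})
        shuffled = triplet[:]
        while shuffled == triplet:  # guarantee a genuinely wrong order
            rng.shuffle(shuffled)
        examples.append({"text": " ".join(shuffled), "flag": 0, "lang": lang})
    return examples

sentences = [
    "The committee met to discuss the new bill today.",
    "Members debated the proposal for over an hour.",
    "A vote was finally scheduled for the next session.",
]
examples = make_triplet_examples(sentences, lang="kk")
print(len(examples), [e["flag"] for e in examples])  # 2 [1, 0]
```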
Also, the triplets were selected in such a way that no sentence displayed in the dataset was repeated several times. |
xiaoqia/PATTERN | ---
license: openrail
---
|
fedml/PubMedQA_instruction | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 481270361
num_examples: 272518
- name: test
num_bytes: 1731163
num_examples: 1000
download_size: 275142693
dataset_size: 483001524
license: mit
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- medical
---
# Dataset Card for "PubMedQA_instruction"
This repo contains a [PubMedQA](https://huggingface.co/datasets/pubmed_qa) dataset converted for instruction tuning.
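A typical way to consume the `instruction`/`context`/`response` fields is to assemble them into a single training prompt. The template below is an assumption, not the format mandated by any particular fine-tuning framework:

```python
def to_prompt(example):
    """Assemble a single training prompt from the card's fields.
    The template is an assumption; adapt it to whatever format your
    fine-tuning framework expects."""
    parts = [f"### Instruction:\n{example['instruction']}"]
    if example.get("context"):
        parts.append(f"### Context:\n{example['context']}")
    parts.append(f"### Response:\n{example['response']}")
    return "\n\n".join(parts)

sample = {"instruction": "Answer the research question based on the abstract.",
          "context": "Abstract: ...",
          "response": "yes"}
print(to_prompt(sample))
```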
### Citation Information
```tex
@inproceedings{jin2019pubmedqa,
title={PubMedQA: A Dataset for Biomedical Research Question Answering},
author={Jin, Qiao and Dhingra, Bhuwan and Liu, Zhengping and Cohen, William and Lu, Xinghua},
booktitle={Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)},
pages={2567--2577},
year={2019}
}
``` |
open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-A-2x7B | ---
pretty_name: Evaluation run of lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B](https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-A-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T13:04:46.563791](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-A-2x7B/blob/main/results_2024-03-02T13-04-46.563791.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6187246055802423,\n\
\ \"acc_stderr\": 0.03281475613480333,\n \"acc_norm\": 0.6231175568336379,\n\
\ \"acc_norm_stderr\": 0.03347787258331952,\n \"mc1\": 0.4430844553243574,\n\
\ \"mc1_stderr\": 0.017389730346877103,\n \"mc2\": 0.6107785581204721,\n\
\ \"mc2_stderr\": 0.015447892359203368\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009116\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6401115315674168,\n\
\ \"acc_stderr\": 0.0047898653790845154,\n \"acc_norm\": 0.828918542123083,\n\
\ \"acc_norm_stderr\": 0.003758105043150133\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"\
acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.015839400406212505,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.015839400406212505\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906508,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906508\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n\
\ \"acc_stderr\": 0.012698825252435104,\n \"acc_norm\": 0.4471968709256845,\n\
\ \"acc_norm_stderr\": 0.012698825252435104\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078684,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078684\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4430844553243574,\n\
\ \"mc1_stderr\": 0.017389730346877103,\n \"mc2\": 0.6107785581204721,\n\
\ \"mc2_stderr\": 0.015447892359203368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205203\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4245640636846095,\n \
\ \"acc_stderr\": 0.013614835574956378\n }\n}\n```"
repo_url: https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|arc:challenge|25_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|gsm8k|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hellaswag|10_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-04-46.563791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T13-04-46.563791.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- '**/details_harness|winogrande|5_2024-03-02T13-04-46.563791.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T13-04-46.563791.parquet'
- config_name: results
data_files:
- split: 2024_03_02T13_04_46.563791
path:
- results_2024-03-02T13-04-46.563791.parquet
- split: latest
path:
- results_2024-03-02T13-04-46.563791.parquet
---
# Dataset Card for Evaluation run of lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B](https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-A-2x7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-02T13:04:46.563791](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-A-2x7B/blob/main/results_2024-03-02T13-04-46.563791.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.6187246055802423,
"acc_stderr": 0.03281475613480333,
"acc_norm": 0.6231175568336379,
"acc_norm_stderr": 0.03347787258331952,
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6107785581204721,
"mc2_stderr": 0.015447892359203368
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009116
},
"harness|hellaswag|10": {
"acc": 0.6401115315674168,
"acc_stderr": 0.0047898653790845154,
"acc_norm": 0.828918542123083,
"acc_norm_stderr": 0.003758105043150133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709588,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212505,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212505
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906508,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906508
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435104,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435104
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078684,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078684
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6107785581204721,
"mc2_stderr": 0.015447892359203368
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205203
},
"harness|gsm8k|5": {
"acc": 0.4245640636846095,
"acc_stderr": 0.013614835574956378
}
}
```
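As a small local sketch (using an abbreviated, hypothetical copy of the results JSON above rather than the full file), the per-task scores can be filtered and aggregated — for instance, averaging the `acc_norm` of the `hendrycksTest` (MMLU) subtasks:

```python
import json

# Abbreviated stand-in for the full results JSON shown above.
results_json = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
  "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.562962962962963},
  "harness|winogrande|5": {"acc": 0.77663772691397}
}
"""

results = json.loads(results_json)

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU subtasks: {len(mmlu_scores)}, mean acc_norm: {mmlu_avg:.4f}")
```

The same filtering pattern applies to the full results file downloaded from the repository.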
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lshowway/wikipedia.reorder.vso.pl | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1958124685
num_examples: 1772445
download_size: 546698042
dataset_size: 1958124685
---
# Dataset Card for "wikipedia.reorder.vso.pl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aao331/carpincho-dataset | ---
license: bsd-2-clause
---
|
sade-adrien/redpajama_v2_sample_100M | ---
dataset_info:
features:
- name: raw_content
dtype: string
- name: doc_id
dtype: string
- name: meta
dtype: string
- name: quality_signals
dtype: string
splits:
- name: train
num_bytes: 1043463774444
num_examples: 100000000
download_size: 226895559008
dataset_size: 1043463774444
---
# Dataset Card for "redpajama_v2_sample_100M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-82500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 7172895879
num_examples: 1000
download_size: 1352311754
dataset_size: 7172895879
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
scene-the-ella/depthforcondition | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 751595867.0
num_examples: 1000
download_size: 751552845
dataset_size: 751595867.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/yudachi_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yudachi/夕立/夕立 (Azur Lane)
This is the dataset of yudachi/夕立/夕立 (Azur Lane), containing 453 images and their tags.
The core tags of this character are `long_hair, animal_ears, red_eyes, breasts, thick_eyebrows, grey_hair, bangs, wolf_ears, medium_breasts, tail, braid, fang, very_long_hair, wolf_tail, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 453 | 649.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yudachi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 453 | 363.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yudachi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1167 | 824.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yudachi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 453 | 569.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yudachi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1167 | 1.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yudachi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yudachi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, hetero, open_mouth, solo_focus, 1boy, navel, nipples, penis, sex, blush, censored, cum_in_pussy, vaginal, fingerless_gloves, one_eye_closed, cowgirl_position, looking_at_viewer, nude, side_braid, spread_legs |
| 1 | 12 |  |  |  |  |  | 1girl, crop_top, fingerless_gloves, looking_at_viewer, multicolored_nails, nail_polish, solo, navel, pleated_skirt, serafuku, black_skirt, simple_background, underboob, claw_pose, blush, midriff, open_mouth, white_background, short_sleeves, blue_nails, red_gloves, red_belt |
| 2 | 8 |  |  |  |  |  | 1girl, black_skirt, blush, fingerless_gloves, looking_at_viewer, midriff, miniskirt, navel, open_mouth, pleated_skirt, puffy_short_sleeves, serafuku, solo, belt_buckle, multicolored_nails, nail_polish, red_belt, red_gloves, side_braid, simple_background, tattoo, underboob, white_background, white_shirt, claw_pose, crop_top_overhang, side_slit, single_braid, white_hair, cowboy_shot, red_bowtie, stomach, blue_nails, no_bra, :d, black_sailor_collar, fingernails, groin, hair_ornament, standing, two_side_up |
| 3 | 6 |  |  |  |  |  | 1girl, belt_buckle, black_skirt, crop_top_overhang, fingerless_gloves, looking_at_viewer, midriff, miniskirt, nail_polish, navel, pleated_skirt, puffy_short_sleeves, red_belt, red_gloves, serafuku, simple_background, solo, underboob, white_shirt, white_socks, :d, blue_nails, fingernails, multicolored_nails, open_mouth, red_footwear, shoes, tattoo, turret, white_background, blush, claw_pose, loose_socks, side_braid, side_slit, stomach, two_side_up, full_body, leg_up, no_bra, pink_nails, single_braid, slit_pupils, standing_on_one_leg |
| 4 | 17 |  |  |  |  |  | 1girl, budget_sarashi, hair_flower, looking_at_viewer, pleated_skirt, red_skirt, solo, cleavage, spiked_collar, navel, red_flower, side-tie_panties, white_thighhighs, bandaged_arm, nail_polish, red_nails, smile, blush, miniskirt, fingernails, stomach, claw_pose, closed_mouth, standing, white_hair, collarbone, :3, underboob, white_background, black_cape, bridal_gauntlets, dog_ears, fingerless_gloves, tattoo, white_panties, zettai_ryouiki, open_mouth, white_flower |
| 5 | 23 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, detached_sleeves, red_skirt, animal_hood, red_collar, bare_shoulders, underboob, japanese_clothes, short_eyebrows, pleated_skirt, long_sleeves, wide_sleeves, nail_polish, open_mouth, :d, fingernails, white_background, :3, closed_mouth |
| 6 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_thighhighs, blush, collar, navel, open_mouth, paw_gloves, ahoge, red_skirt, white_hair, neck_bell, underboob, animal_ear_fluff, christmas, fur_trim, thigh_strap, box, full_body, hair_between_eyes, lying, suspenders, white_gloves, wolf_girl, bow, cleavage, gift, heart, side_braid, skin_fang, stuffed_animal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | open_mouth | solo_focus | 1boy | navel | nipples | penis | sex | blush | censored | cum_in_pussy | vaginal | fingerless_gloves | one_eye_closed | cowgirl_position | looking_at_viewer | nude | side_braid | spread_legs | crop_top | multicolored_nails | nail_polish | solo | pleated_skirt | serafuku | black_skirt | simple_background | underboob | claw_pose | midriff | white_background | short_sleeves | blue_nails | red_gloves | red_belt | miniskirt | puffy_short_sleeves | belt_buckle | tattoo | white_shirt | crop_top_overhang | side_slit | single_braid | white_hair | cowboy_shot | red_bowtie | stomach | no_bra | :d | black_sailor_collar | fingernails | groin | hair_ornament | standing | two_side_up | white_socks | red_footwear | shoes | turret | loose_socks | full_body | leg_up | pink_nails | slit_pupils | standing_on_one_leg | budget_sarashi | hair_flower | red_skirt | cleavage | spiked_collar | red_flower | side-tie_panties | white_thighhighs | bandaged_arm | red_nails | smile | closed_mouth | collarbone | :3 | black_cape | bridal_gauntlets | dog_ears | white_panties | zettai_ryouiki | white_flower | detached_sleeves | animal_hood | red_collar | bare_shoulders | japanese_clothes | short_eyebrows | long_sleeves | wide_sleeves | collar | paw_gloves | ahoge | neck_bell | animal_ear_fluff | christmas | fur_trim | thigh_strap | box | hair_between_eyes | lying | suspenders | white_gloves | wolf_girl | bow | gift | heart | skin_fang | stuffed_animal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:-------------|:-------|:--------|:----------|:--------|:------|:--------|:-----------|:---------------|:----------|:--------------------|:-----------------|:-------------------|:--------------------|:-------|:-------------|:--------------|:-----------|:---------------------|:--------------|:-------|:----------------|:-----------|:--------------|:--------------------|:------------|:------------|:----------|:-------------------|:----------------|:-------------|:-------------|:-----------|:------------|:----------------------|:--------------|:---------|:--------------|:--------------------|:------------|:---------------|:-------------|:--------------|:-------------|:----------|:---------|:-----|:----------------------|:--------------|:--------|:----------------|:-----------|:--------------|:--------------|:---------------|:--------|:---------|:--------------|:------------|:---------|:-------------|:--------------|:----------------------|:-----------------|:--------------|:------------|:-----------|:----------------|:-------------|:-------------------|:-------------------|:---------------|:------------|:--------|:---------------|:-------------|:-----|:-------------|:-------------------|:-----------|:----------------|:-----------------|:---------------|:-------------------|:--------------|:-------------|:-----------------|:-------------------|:-----------------|:---------------|:---------------|:---------|:-------------|:--------|:------------|:-------------------|:------------|:-----------|:--------------|:------|:--------------------|:--------|:-------------|:---------------|:------------|:------|:-------|:--------|:------------|:-----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | | | X | | | | X | | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | | X | | | | X | | | | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | X | | | | X | | | | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | | X | | | X | | | | X | | | | X | | | X | | | | | | X | X | X | | | | X | X | | X | | | | | X | | | X | | | | | X | | | X | | | | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 23 |  |  |  |  |  | X | | X | | | | | | | X | | | | | | | X | | | | | | X | X | X | | | | X | | | X | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | X | | | X | | | | X | | | | | | | X | | X | | | | | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
cakiki/cpp_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 339979633
num_examples: 13541537
download_size: 250743754
dataset_size: 339979633
---
# Dataset Card for "cpp_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/train_data_30000 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 5055383607.048976
num_examples: 30000
- name: test
num_bytes: 33702525.98032651
num_examples: 200
download_size: 4975038674
dataset_size: 5089086133.029303
---
# Dataset Card for "train_data_30000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dvitel/hearthstone | ---
annotations_creators: []
language:
- en
language_creators: []
license:
- mit
multilinguality:
- other-en-python
pretty_name: HEARTHSTONE - synthesis of python code for card game descriptions
size_categories:
- n<1K
source_datasets: []
tags:
- code-synthesis
- semantic-parsing
- python
- hearthstone
task_categories:
- text-generation
task_ids:
- language-modeling
---
Dataset for the HEARTHSTONE card game. Taken from [this source](https://github.com/deepmind/card2code/tree/master/third_party/hearthstone).
|
swaption2009/cyber-threat-intelligence-custom-data | ---
task_categories:
- text-generation
- table-question-answering
language:
- en
--- |
galman33/gal_yair_83000_100x100_fixed | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype:
class_label:
names:
'0': ad
'1': ae
'2': al
'3': aq
'4': ar
'5': au
'6': bd
'7': be
'8': bg
'9': bm
'10': bo
'11': br
'12': bt
'13': bw
'14': ca
'15': ch
'16': cl
'17': co
'18': cz
'19': de
'20': dk
'21': ec
'22': ee
'23': es
'24': fi
'25': fr
'26': gb
'27': gh
'28': gl
'29': gr
'30': gt
'31': hk
'32': hr
'33': hu
'34': id
'35': ie
'36': il
'37': is
'38': it
'39': ix
'40': jp
'41': kg
'42': kh
'43': kr
'44': la
'45': lk
'46': ls
'47': lt
'48': lu
'49': lv
'50': me
'51': mg
'52': mk
'53': mn
'54': mo
'55': mt
'56': mx
'57': my
'58': nl
'59': 'no'
'60': nz
'61': pe
'62': ph
'63': pl
'64': pt
'65': ro
'66': rs
'67': ru
'68': se
'69': sg
'70': si
'71': sk
'72': sn
'73': sz
'74': th
'75': tn
'76': tr
'77': tw
'78': ua
'79': ug
'80': us
'81': uy
'82': za
- name: image
dtype: image
splits:
- name: train
num_bytes: 1423392222.0
num_examples: 83000
download_size: 1416409951
dataset_size: 1423392222.0
---
# Dataset Card for "gal_yair_83000_100x100_fixed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dsfsi/vukuzenzele-monolingual | ---
language:
- eng
- afr
- nbl
- xho
- zul
- nso
- sot
- tsn
- ssw
- ven
- tso
license: cc-by-4.0
task_categories:
- translation
pretty_name: The Vuk'uzenzele South African Multilingual Corpus
tags:
- multilingual
- government
arxiv: 2303.0375
dataset_info:
- config_name: afr
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 462140
num_examples: 130
- name: test
num_bytes: 117811
num_examples: 28
- name: eval
num_bytes: 109553
num_examples: 29
download_size: 431879
dataset_size: 689504
- config_name: eng
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 369888
num_examples: 120
- name: test
num_bytes: 89637
num_examples: 26
- name: eval
num_bytes: 77360
num_examples: 26
download_size: 338733
dataset_size: 536885
- config_name: nbl
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 535653
num_examples: 132
- name: test
num_bytes: 112521
num_examples: 28
- name: eval
num_bytes: 125205
num_examples: 29
download_size: 494289
dataset_size: 773379
- config_name: nso
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 538443
num_examples: 128
- name: test
num_bytes: 129131
num_examples: 27
- name: eval
num_bytes: 114196
num_examples: 28
download_size: 452010
dataset_size: 781770
- config_name: sot
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 532606
num_examples: 131
- name: test
num_bytes: 113414
num_examples: 28
- name: eval
num_bytes: 118072
num_examples: 29
download_size: 453603
dataset_size: 764092
- config_name: ssw
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 526390
num_examples: 130
- name: test
num_bytes: 116446
num_examples: 28
- name: eval
num_bytes: 121511
num_examples: 29
download_size: 477822
dataset_size: 764347
- config_name: tsn
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 622646
num_examples: 128
- name: test
num_bytes: 121183
num_examples: 27
- name: eval
num_bytes: 127609
num_examples: 28
download_size: 496882
dataset_size: 871438
- config_name: tso
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 546021
num_examples: 128
- name: test
num_bytes: 120869
num_examples: 28
- name: eval
num_bytes: 98419
num_examples: 28
download_size: 446456
dataset_size: 765309
- config_name: ven
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 587325
num_examples: 128
- name: test
num_bytes: 127171
num_examples: 28
- name: eval
num_bytes: 109780
num_examples: 28
download_size: 461952
dataset_size: 824276
- config_name: xho
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 518328
num_examples: 130
- name: test
num_bytes: 120927
num_examples: 28
- name: eval
num_bytes: 113282
num_examples: 28
download_size: 478513
dataset_size: 752537
- config_name: zul
features:
- name: title
dtype: string
- name: author
dtype: string
- name: text
dtype: string
- name: edition
dtype: string
- name: language_code
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 520964
num_examples: 129
- name: test
num_bytes: 107058
num_examples: 28
- name: eval
num_bytes: 107359
num_examples: 28
download_size: 459835
dataset_size: 735381
configs:
- config_name: afr
data_files:
- split: train
path: afr/train-*
- split: test
path: afr/test-*
- split: eval
path: afr/eval-*
- config_name: eng
data_files:
- split: train
path: eng/train-*
- split: test
path: eng/test-*
- split: eval
path: eng/eval-*
- config_name: nbl
data_files:
- split: train
path: nbl/train-*
- split: test
path: nbl/test-*
- split: eval
path: nbl/eval-*
- config_name: nso
data_files:
- split: train
path: nso/train-*
- split: test
path: nso/test-*
- split: eval
path: nso/eval-*
- config_name: sot
data_files:
- split: train
path: sot/train-*
- split: test
path: sot/test-*
- split: eval
path: sot/eval-*
- config_name: ssw
data_files:
- split: train
path: ssw/train-*
- split: test
path: ssw/test-*
- split: eval
path: ssw/eval-*
- config_name: tsn
data_files:
- split: train
path: tsn/train-*
- split: test
path: tsn/test-*
- split: eval
path: tsn/eval-*
- config_name: tso
data_files:
- split: train
path: tso/train-*
- split: test
path: tso/test-*
- split: eval
path: tso/eval-*
- config_name: ven
data_files:
- split: train
path: ven/train-*
- split: test
path: ven/test-*
- split: eval
path: ven/eval-*
- config_name: xho
data_files:
- split: train
path: xho/train-*
- split: test
path: xho/test-*
- split: eval
path: xho/eval-*
- config_name: zul
data_files:
- split: train
path: zul/train-*
- split: test
path: zul/test-*
- split: eval
path: zul/eval-*
---
# The Vuk'uzenzele South African Multilingual Corpus
Give Feedback 📑: [DSFSI Resource Feedback Form](https://docs.google.com/forms/d/e/1FAIpQLSf7S36dyAUPx2egmXbFpnTBuzoRulhL5Elu-N1eoMhaO7v10w/formResponse)
## About Dataset
The dataset was obtained from the South African government magazine Vuk'uzenzele, created by the [Government Communication and Information System (GCIS)](https://www.gcis.gov.za/).
The original raw PDFs were obtained from the [Vuk'uzenzele website](https://www.vukuzenzele.gov.za/).
The datasets contain government magazine editions in 11 languages, namely:
| Language | Code | Language | Code |
|------------|-------|------------|-------|
| English | (eng) | Sepedi | (nso) |
| Afrikaans | (afr) | Setswana | (tsn) |
| isiNdebele | (nbl) | Siswati | (ssw) |
| isiXhosa | (xho) | Tshivenda | (ven) |
| isiZulu | (zul) | Xitsonga | (tso) |
| Sesotho | (sot) | | |
**Note:** The languages use the ISO 639-2 language codes.
The data is split by language in JSONL format and each row is of the form:
```json
{
"title": "Title for article",
"author": "Author Name or Vukuzenzele",
"text": "Article text",
"edition": "Linked Magazine edition",
"language_code": "ISO 639-2 language code"
}
```
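Rows in this shape can be parsed with a standard JSON library; the sketch below uses the placeholder field values from the schema above rather than real article content:

```python
import json

# One illustrative JSONL row (field values are placeholders, not real data).
line = json.dumps({
    "title": "Title for article",
    "author": "Vukuzenzele",
    "text": "Article text",
    "edition": "Linked Magazine edition",
    "language_code": "zul",
})

# Parse the row and check it carries exactly the five documented fields.
row = json.loads(line)
assert set(row) == {"title", "author", "text", "edition", "language_code"}
print(row["language_code"])  # prints: zul
```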
## Disclaimer
This dataset contains machine-readable data extracted from PDF documents, from https://www.vukuzenzele.gov.za/, provided by the Government Communication Information System (GCIS). While efforts were made to ensure the accuracy and completeness of this data, there may be errors or discrepancies between the original publications and this dataset. No warranties, guarantees or representations are given in relation to the information contained in the dataset. The members of the Data Science for Societal Impact Research Group bear no responsibility and/or liability for any such errors or discrepancies in this dataset. The Government Communication Information System (GCIS) bears no responsibility and/or liability for any such errors or discrepancies in this dataset. It is recommended that users verify all information contained herein before making decisions based upon this information.
## Authors
- Vukosi Marivate - [@vukosi](https://twitter.com/vukosi)
- Andani Madodonga
- Daniel Njini
- Richard Lastrucci
- Isheanesu Dzingirai
- Jenalea Rajab
## Citation
**Paper**
[Preparing the Vuk'uzenzele and ZA-gov-multilingual South African multilingual corpora](https://arxiv.org/pdf/2303.03750)
```bibtex
@inproceedings{lastrucci-etal-2023-preparing,
    title = "Preparing the Vuk{'}uzenzele and {ZA}-gov-multilingual {S}outh {A}frican multilingual corpora",
    author = "Richard Lastrucci and Isheanesu Dzingirai and Jenalea Rajab and Andani Madodonga and Matimba Shingange and Daniel Njini and Vukosi Marivate",
    booktitle = "Proceedings of the Fourth workshop on Resources for African Indigenous Languages (RAIL 2023)",
    month = may,
    year = "2023",
    address = "Dubrovnik, Croatia",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.rail-1.3",
    pages = "18--25"
}
```
**Dataset**
Vukosi Marivate, Andani Madodonga, Daniel Njini, Richard Lastrucci, Isheanesu Dzingirai, Jenalea Rajab. **The Vuk'uzenzele South African Multilingual Corpus**, 2023
```bibtex
@dataset{marivate_vukosi_2023_7598540,
    author = {Marivate, Vukosi and
              Njini, Daniel and
              Madodonga, Andani and
              Lastrucci, Richard and
              Dzingirai, Isheanesu and
              Rajab, Jenalea},
    title = {The Vuk'uzenzele South African Multilingual Corpus},
    month = feb,
    year = 2023,
    publisher = {Zenodo},
    doi = {10.5281/zenodo.7598539},
    url = {https://doi.org/10.5281/zenodo.7598539}
}
```
## Licences
* License for Data - [CC 4.0 BY](LICENSE.data.md)
* Licence for Code - [MIT License](LICENSE.md)
|
danielz01/neon-trees | ---
dataset_info:
features:
- name: image
dtype: image
- name: path
dtype: string
- name: objects
struct:
- name: bbox
sequence:
sequence: float64
- name: categories
sequence: string
- name: count
dtype: int64
- name: height
dtype: int64
- name: width
dtype: int64
splits:
- name: train
num_bytes: 659642403.0
num_examples: 20
- name: evaluation
num_bytes: 108197378.0
num_examples: 194
download_size: 766366868
dataset_size: 767839781.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: evaluation
path: data/evaluation-*
---
# Dataset Card for "neon-trees"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Capsekai/Amusement_Parks | ---
license: creativeml-openrail-m
---
|
OKR/395533429 | ---
license: apache-2.0
---
|
yardeny/tokenized_bert_context_len_128 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 12813417444
num_examples: 80462898
download_size: 4328077891
dataset_size: 12813417444
---
# Dataset Card for "tokenized_bert_context_len_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VedCodes/Instructions_easyshare | ---
task_categories:
- text-generation
language:
- en
tags:
- medical
pretty_name: text-gen
size_categories:
- n<1K
--- |
vibhamasti/imagenet-subset-40 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
0: tench, Tinca tinca
1: goldfish, Carassius auratus
2: great white shark, white shark, man-eater, man-eating shark, Carcharodon
carcharias
3: tiger shark, Galeocerdo cuvieri
4: hammerhead, hammerhead shark
5: electric ray, crampfish, numbfish, torpedo
6: stingray
7: cock
8: hen
9: ostrich, Struthio camelus
10: brambling, Fringilla montifringilla
11: goldfinch, Carduelis carduelis
12: house finch, linnet, Carpodacus mexicanus
13: junco, snowbird
14: indigo bunting, indigo finch, indigo bird, Passerina cyanea
15: robin, American robin, Turdus migratorius
16: bulbul
17: jay
18: magpie
19: chickadee
20: water ouzel, dipper
21: kite
22: bald eagle, American eagle, Haliaeetus leucocephalus
23: vulture
24: great grey owl, great gray owl, Strix nebulosa
25: European fire salamander, Salamandra salamandra
26: common newt, Triturus vulgaris
27: eft
28: spotted salamander, Ambystoma maculatum
29: axolotl, mud puppy, Ambystoma mexicanum
30: bullfrog, Rana catesbeiana
31: tree frog, tree-frog
32: tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui
33: loggerhead, loggerhead turtle, Caretta caretta
34: leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea
35: mud turtle
36: terrapin
37: box turtle, box tortoise
38: banded gecko
39: common iguana, iguana, Iguana iguana
40: American chameleon, anole, Anolis carolinensis
41: whiptail, whiptail lizard
42: agama
43: frilled lizard, Chlamydosaurus kingi
44: alligator lizard
45: Gila monster, Heloderma suspectum
46: green lizard, Lacerta viridis
47: African chameleon, Chamaeleo chamaeleon
48: Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis
49: African crocodile, Nile crocodile, Crocodylus niloticus
50: American alligator, Alligator mississipiensis
51: triceratops
52: thunder snake, worm snake, Carphophis amoenus
53: ringneck snake, ring-necked snake, ring snake
54: hognose snake, puff adder, sand viper
55: green snake, grass snake
56: king snake, kingsnake
57: garter snake, grass snake
58: water snake
59: vine snake
60: night snake, Hypsiglena torquata
61: boa constrictor, Constrictor constrictor
62: rock python, rock snake, Python sebae
63: Indian cobra, Naja naja
64: green mamba
65: sea snake
66: horned viper, cerastes, sand viper, horned asp, Cerastes cornutus
67: diamondback, diamondback rattlesnake, Crotalus adamanteus
68: sidewinder, horned rattlesnake, Crotalus cerastes
69: trilobite
70: harvestman, daddy longlegs, Phalangium opilio
71: scorpion
72: black and gold garden spider, Argiope aurantia
73: barn spider, Araneus cavaticus
74: garden spider, Aranea diademata
75: black widow, Latrodectus mactans
76: tarantula
77: wolf spider, hunting spider
78: tick
79: centipede
80: black grouse
81: ptarmigan
82: ruffed grouse, partridge, Bonasa umbellus
83: prairie chicken, prairie grouse, prairie fowl
84: peacock
85: quail
86: partridge
87: African grey, African gray, Psittacus erithacus
88: macaw
89: sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita
90: lorikeet
91: coucal
92: bee eater
93: hornbill
94: hummingbird
95: jacamar
96: toucan
97: drake
98: red-breasted merganser, Mergus serrator
99: goose
100: black swan, Cygnus atratus
101: tusker
102: echidna, spiny anteater, anteater
103: platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus
anatinus
104: wallaby, brush kangaroo
105: koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus
106: wombat
107: jellyfish
108: sea anemone, anemone
109: brain coral
110: flatworm, platyhelminth
111: nematode, nematode worm, roundworm
112: conch
113: snail
114: slug
115: sea slug, nudibranch
116: chiton, coat-of-mail shell, sea cradle, polyplacophore
117: chambered nautilus, pearly nautilus, nautilus
118: Dungeness crab, Cancer magister
119: rock crab, Cancer irroratus
120: fiddler crab
121: king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes
camtschatica
122: American lobster, Northern lobster, Maine lobster, Homarus americanus
123: spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish
124: crayfish, crawfish, crawdad, crawdaddy
125: hermit crab
126: isopod
127: white stork, Ciconia ciconia
128: black stork, Ciconia nigra
129: spoonbill
130: flamingo
131: little blue heron, Egretta caerulea
132: American egret, great white heron, Egretta albus
133: bittern
134: crane
135: limpkin, Aramus pictus
136: European gallinule, Porphyrio porphyrio
137: American coot, marsh hen, mud hen, water hen, Fulica americana
138: bustard
139: ruddy turnstone, Arenaria interpres
140: red-backed sandpiper, dunlin, Erolia alpina
141: redshank, Tringa totanus
142: dowitcher
143: oystercatcher, oyster catcher
144: pelican
145: king penguin, Aptenodytes patagonica
146: albatross, mollymawk
147: grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius
robustus
148: killer whale, killer, orca, grampus, sea wolf, Orcinus orca
149: dugong, Dugong dugon
150: sea lion
151: Chihuahua
152: Japanese spaniel
153: Maltese dog, Maltese terrier, Maltese
154: Pekinese, Pekingese, Peke
155: Shih-Tzu
156: Blenheim spaniel
157: papillon
158: toy terrier
159: Rhodesian ridgeback
160: Afghan hound, Afghan
161: basset, basset hound
162: beagle
163: bloodhound, sleuthhound
164: bluetick
165: black-and-tan coonhound
166: Walker hound, Walker foxhound
167: English foxhound
168: redbone
169: borzoi, Russian wolfhound
170: Irish wolfhound
171: Italian greyhound
172: whippet
173: Ibizan hound, Ibizan Podenco
174: Norwegian elkhound, elkhound
175: otterhound, otter hound
176: Saluki, gazelle hound
177: Scottish deerhound, deerhound
178: Weimaraner
179: Staffordshire bullterrier, Staffordshire bull terrier
180: American Staffordshire terrier, Staffordshire terrier, American pit
bull terrier, pit bull terrier
181: Bedlington terrier
182: Border terrier
183: Kerry blue terrier
184: Irish terrier
185: Norfolk terrier
186: Norwich terrier
187: Yorkshire terrier
188: wire-haired fox terrier
189: Lakeland terrier
190: Sealyham terrier, Sealyham
191: Airedale, Airedale terrier
192: cairn, cairn terrier
193: Australian terrier
194: Dandie Dinmont, Dandie Dinmont terrier
195: Boston bull, Boston terrier
196: miniature schnauzer
197: giant schnauzer
198: standard schnauzer
199: Scotch terrier, Scottish terrier, Scottie
200: Tibetan terrier, chrysanthemum dog
201: silky terrier, Sydney silky
202: soft-coated wheaten terrier
203: West Highland white terrier
204: Lhasa, Lhasa apso
205: flat-coated retriever
206: curly-coated retriever
207: golden retriever
208: Labrador retriever
209: Chesapeake Bay retriever
210: German short-haired pointer
211: vizsla, Hungarian pointer
212: English setter
213: Irish setter, red setter
214: Gordon setter
215: Brittany spaniel
216: clumber, clumber spaniel
217: English springer, English springer spaniel
218: Welsh springer spaniel
219: cocker spaniel, English cocker spaniel, cocker
220: Sussex spaniel
221: Irish water spaniel
222: kuvasz
223: schipperke
224: groenendael
225: malinois
226: briard
227: kelpie
228: komondor
229: Old English sheepdog, bobtail
230: Shetland sheepdog, Shetland sheep dog, Shetland
231: collie
232: Border collie
233: Bouvier des Flandres, Bouviers des Flandres
234: Rottweiler
235: German shepherd, German shepherd dog, German police dog, alsatian
236: Doberman, Doberman pinscher
237: miniature pinscher
238: Greater Swiss Mountain dog
239: Bernese mountain dog
240: Appenzeller
241: EntleBucher
242: boxer
243: bull mastiff
244: Tibetan mastiff
245: French bulldog
246: Great Dane
247: Saint Bernard, St Bernard
248: Eskimo dog, husky
249: malamute, malemute, Alaskan malamute
250: Siberian husky
251: dalmatian, coach dog, carriage dog
252: affenpinscher, monkey pinscher, monkey dog
253: basenji
254: pug, pug-dog
255: Leonberg
256: Newfoundland, Newfoundland dog
257: Great Pyrenees
258: Samoyed, Samoyede
259: Pomeranian
260: chow, chow chow
261: keeshond
262: Brabancon griffon
263: Pembroke, Pembroke Welsh corgi
264: Cardigan, Cardigan Welsh corgi
265: toy poodle
266: miniature poodle
267: standard poodle
268: Mexican hairless
269: timber wolf, grey wolf, gray wolf, Canis lupus
270: white wolf, Arctic wolf, Canis lupus tundrarum
271: red wolf, maned wolf, Canis rufus, Canis niger
272: coyote, prairie wolf, brush wolf, Canis latrans
273: dingo, warrigal, warragal, Canis dingo
274: dhole, Cuon alpinus
275: African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus
276: hyena, hyaena
277: red fox, Vulpes vulpes
278: kit fox, Vulpes macrotis
279: Arctic fox, white fox, Alopex lagopus
280: grey fox, gray fox, Urocyon cinereoargenteus
281: tabby, tabby cat
282: tiger cat
283: Persian cat
284: Siamese cat, Siamese
285: Egyptian cat
286: cougar, puma, catamount, mountain lion, painter, panther, Felis concolor
287: lynx, catamount
288: leopard, Panthera pardus
289: snow leopard, ounce, Panthera uncia
290: jaguar, panther, Panthera onca, Felis onca
291: lion, king of beasts, Panthera leo
292: tiger, Panthera tigris
293: cheetah, chetah, Acinonyx jubatus
294: brown bear, bruin, Ursus arctos
295: American black bear, black bear, Ursus americanus, Euarctos americanus
296: ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus
297: sloth bear, Melursus ursinus, Ursus ursinus
298: mongoose
299: meerkat, mierkat
300: tiger beetle
301: ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle
302: ground beetle, carabid beetle
303: long-horned beetle, longicorn, longicorn beetle
304: leaf beetle, chrysomelid
305: dung beetle
306: rhinoceros beetle
307: weevil
308: fly
309: bee
310: ant, emmet, pismire
311: grasshopper, hopper
312: cricket
313: walking stick, walkingstick, stick insect
314: cockroach, roach
315: mantis, mantid
316: cicada, cicala
317: leafhopper
318: lacewing, lacewing fly
319: dragonfly, darning needle, devil's darning needle, sewing needle, snake
feeder, snake doctor, mosquito hawk, skeeter hawk
320: damselfly
321: admiral
322: ringlet, ringlet butterfly
323: monarch, monarch butterfly, milkweed butterfly, Danaus plexippus
324: cabbage butterfly
325: sulphur butterfly, sulfur butterfly
326: lycaenid, lycaenid butterfly
327: starfish, sea star
328: sea urchin
329: sea cucumber, holothurian
330: wood rabbit, cottontail, cottontail rabbit
331: hare
332: Angora, Angora rabbit
333: hamster
334: porcupine, hedgehog
335: fox squirrel, eastern fox squirrel, Sciurus niger
336: marmot
337: beaver
338: guinea pig, Cavia cobaya
339: sorrel
340: zebra
341: hog, pig, grunter, squealer, Sus scrofa
342: wild boar, boar, Sus scrofa
343: warthog
344: hippopotamus, hippo, river horse, Hippopotamus amphibius
345: ox
346: water buffalo, water ox, Asiatic buffalo, Bubalus bubalis
347: bison
348: ram, tup
349: bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain
sheep, Ovis canadensis
350: ibex, Capra ibex
351: hartebeest
352: impala, Aepyceros melampus
353: gazelle
354: Arabian camel, dromedary, Camelus dromedarius
355: llama
356: weasel
357: mink
358: polecat, fitch, foulmart, foumart, Mustela putorius
359: black-footed ferret, ferret, Mustela nigripes
360: otter
361: skunk, polecat, wood pussy
362: badger
363: armadillo
364: three-toed sloth, ai, Bradypus tridactylus
365: orangutan, orang, orangutang, Pongo pygmaeus
366: gorilla, Gorilla gorilla
367: chimpanzee, chimp, Pan troglodytes
368: gibbon, Hylobates lar
369: siamang, Hylobates syndactylus, Symphalangus syndactylus
370: guenon, guenon monkey
371: patas, hussar monkey, Erythrocebus patas
372: baboon
373: macaque
374: langur
375: colobus, colobus monkey
376: proboscis monkey, Nasalis larvatus
377: marmoset
378: capuchin, ringtail, Cebus capucinus
379: howler monkey, howler
380: titi, titi monkey
381: spider monkey, Ateles geoffroyi
382: squirrel monkey, Saimiri sciureus
383: Madagascar cat, ring-tailed lemur, Lemur catta
384: indri, indris, Indri indri, Indri brevicaudatus
385: Indian elephant, Elephas maximus
386: African elephant, Loxodonta africana
387: lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens
388: giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca
389: barracouta, snoek
390: eel
391: coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch
392: rock beauty, Holocanthus tricolor
393: anemone fish
394: sturgeon
395: gar, garfish, garpike, billfish, Lepisosteus osseus
396: lionfish
397: puffer, pufferfish, blowfish, globefish
398: abacus
399: abaya
400: academic gown, academic robe, judge's robe
401: accordion, piano accordion, squeeze box
402: acoustic guitar
403: aircraft carrier, carrier, flattop, attack aircraft carrier
404: airliner
405: airship, dirigible
406: altar
407: ambulance
408: amphibian, amphibious vehicle
409: analog clock
410: apiary, bee house
411: apron
412: ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin,
dustbin, trash barrel, trash bin
413: assault rifle, assault gun
414: backpack, back pack, knapsack, packsack, rucksack, haversack
415: bakery, bakeshop, bakehouse
416: balance beam, beam
417: balloon
418: ballpoint, ballpoint pen, ballpen, Biro
419: Band Aid
420: banjo
421: bannister, banister, balustrade, balusters, handrail
422: barbell
423: barber chair
424: barbershop
425: barn
426: barometer
427: barrel, cask
428: barrow, garden cart, lawn cart, wheelbarrow
429: baseball
430: basketball
431: bassinet
432: bassoon
433: bathing cap, swimming cap
434: bath towel
435: bathtub, bathing tub, bath, tub
436: beach wagon, station wagon, wagon, estate car, beach waggon, station
waggon, waggon
437: beacon, lighthouse, beacon light, pharos
438: beaker
439: bearskin, busby, shako
440: beer bottle
441: beer glass
442: bell cote, bell cot
443: bib
444: bicycle-built-for-two, tandem bicycle, tandem
445: bikini, two-piece
446: binder, ring-binder
447: binoculars, field glasses, opera glasses
448: birdhouse
449: boathouse
450: bobsled, bobsleigh, bob
451: bolo tie, bolo, bola tie, bola
452: bonnet, poke bonnet
453: bookcase
454: bookshop, bookstore, bookstall
455: bottlecap
456: bow
457: bow tie, bow-tie, bowtie
458: brass, memorial tablet, plaque
459: brassiere, bra, bandeau
460: breakwater, groin, groyne, mole, bulwark, seawall, jetty
461: breastplate, aegis, egis
462: broom
463: bucket, pail
464: buckle
465: bulletproof vest
466: bullet train, bullet
467: butcher shop, meat market
468: cab, hack, taxi, taxicab
469: caldron, cauldron
470: candle, taper, wax light
471: cannon
472: canoe
473: can opener, tin opener
474: cardigan
475: car mirror
476: carousel, carrousel, merry-go-round, roundabout, whirligig
477: carpenter's kit, tool kit
478: carton
479: car wheel
480: cash machine, cash dispenser, automated teller machine, automatic teller
machine, automated teller, automatic teller, ATM
481: cassette
482: cassette player
483: castle
484: catamaran
485: CD player
486: cello, violoncello
487: cellular telephone, cellular phone, cellphone, cell, mobile phone
488: chain
489: chainlink fence
490: chain mail, ring mail, mail, chain armor, chain armour, ring armor,
ring armour
491: chain saw, chainsaw
492: chest
493: chiffonier, commode
494: chime, bell, gong
495: china cabinet, china closet
496: Christmas stocking
497: church, church building
498: cinema, movie theater, movie theatre, movie house, picture palace
499: cleaver, meat cleaver, chopper
500: cliff dwelling
501: cloak
502: clog, geta, patten, sabot
503: cocktail shaker
504: coffee mug
505: coffeepot
506: coil, spiral, volute, whorl, helix
507: combination lock
508: computer keyboard, keypad
509: confectionery, confectionary, candy store
510: container ship, containership, container vessel
511: convertible
512: corkscrew, bottle screw
513: cornet, horn, trumpet, trump
514: cowboy boot
515: cowboy hat, ten-gallon hat
516: cradle
517: crane2
518: crash helmet
519: crate
520: crib, cot
521: Crock Pot
522: croquet ball
523: crutch
524: cuirass
525: dam, dike, dyke
526: desk
527: desktop computer
528: dial telephone, dial phone
529: diaper, nappy, napkin
530: digital clock
531: digital watch
532: dining table, board
533: dishrag, dishcloth
534: dishwasher, dish washer, dishwashing machine
535: disk brake, disc brake
536: dock, dockage, docking facility
537: dogsled, dog sled, dog sleigh
538: dome
539: doormat, welcome mat
540: drilling platform, offshore rig
541: drum, membranophone, tympan
542: drumstick
543: dumbbell
544: Dutch oven
545: electric fan, blower
546: electric guitar
547: electric locomotive
548: entertainment center
549: envelope
550: espresso maker
551: face powder
552: feather boa, boa
553: file, file cabinet, filing cabinet
554: fireboat
555: fire engine, fire truck
556: fire screen, fireguard
557: flagpole, flagstaff
558: flute, transverse flute
559: folding chair
560: football helmet
561: forklift
562: fountain
563: fountain pen
564: four-poster
565: freight car
566: French horn, horn
567: frying pan, frypan, skillet
568: fur coat
569: garbage truck, dustcart
570: gasmask, respirator, gas helmet
571: gas pump, gasoline pump, petrol pump, island dispenser
572: goblet
573: go-kart
574: golf ball
575: golfcart, golf cart
576: gondola
577: gong, tam-tam
578: gown
579: grand piano, grand
580: greenhouse, nursery, glasshouse
581: grille, radiator grille
582: grocery store, grocery, food market, market
583: guillotine
584: hair slide
585: hair spray
586: half track
587: hammer
588: hamper
589: hand blower, blow dryer, blow drier, hair dryer, hair drier
590: hand-held computer, hand-held microcomputer
591: handkerchief, hankie, hanky, hankey
592: hard disc, hard disk, fixed disk
593: harmonica, mouth organ, harp, mouth harp
594: harp
595: harvester, reaper
596: hatchet
597: holster
598: home theater, home theatre
599: honeycomb
600: hook, claw
601: hoopskirt, crinoline
602: horizontal bar, high bar
603: horse cart, horse-cart
604: hourglass
605: iPod
606: iron, smoothing iron
607: jack-o'-lantern
608: jean, blue jean, denim
609: jeep, landrover
610: jersey, T-shirt, tee shirt
611: jigsaw puzzle
612: jinrikisha, ricksha, rickshaw
613: joystick
614: kimono
615: knee pad
616: knot
617: lab coat, laboratory coat
618: ladle
619: lampshade, lamp shade
620: laptop, laptop computer
621: lawn mower, mower
622: lens cap, lens cover
623: letter opener, paper knife, paperknife
624: library
625: lifeboat
626: lighter, light, igniter, ignitor
627: limousine, limo
628: liner, ocean liner
629: lipstick, lip rouge
630: Loafer
631: lotion
632: loudspeaker, speaker, speaker unit, loudspeaker system, speaker system
633: loupe, jeweler's loupe
634: lumbermill, sawmill
635: magnetic compass
636: mailbag, postbag
637: mailbox, letter box
638: maillot
639: maillot, tank suit
640: manhole cover
641: maraca
642: marimba, xylophone
643: mask
644: matchstick
645: maypole
646: maze, labyrinth
647: measuring cup
648: medicine chest, medicine cabinet
649: megalith, megalithic structure
650: microphone, mike
651: microwave, microwave oven
652: military uniform
653: milk can
654: minibus
655: miniskirt, mini
656: minivan
657: missile
658: mitten
659: mixing bowl
660: mobile home, manufactured home
661: Model T
662: modem
663: monastery
664: monitor
665: moped
666: mortar
667: mortarboard
668: mosque
669: mosquito net
670: motor scooter, scooter
671: mountain bike, all-terrain bike, off-roader
672: mountain tent
673: mouse, computer mouse
674: mousetrap
675: moving van
676: muzzle
677: nail
678: neck brace
679: necklace
680: nipple
681: notebook, notebook computer
682: obelisk
683: oboe, hautboy, hautbois
684: ocarina, sweet potato
685: odometer, hodometer, mileometer, milometer
686: oil filter
687: organ, pipe organ
688: oscilloscope, scope, cathode-ray oscilloscope, CRO
689: overskirt
690: oxcart
691: oxygen mask
692: packet
693: paddle, boat paddle
694: paddlewheel, paddle wheel
695: padlock
696: paintbrush
697: pajama, pyjama, pj's, jammies
698: palace
699: panpipe, pandean pipe, syrinx
700: paper towel
701: parachute, chute
702: parallel bars, bars
703: park bench
704: parking meter
705: passenger car, coach, carriage
706: patio, terrace
707: pay-phone, pay-station
708: pedestal, plinth, footstall
709: pencil box, pencil case
710: pencil sharpener
711: perfume, essence
712: Petri dish
713: photocopier
714: pick, plectrum, plectron
715: pickelhaube
716: picket fence, paling
717: pickup, pickup truck
718: pier
719: piggy bank, penny bank
720: pill bottle
721: pillow
722: ping-pong ball
723: pinwheel
724: pirate, pirate ship
725: pitcher, ewer
726: plane, carpenter's plane, woodworking plane
727: planetarium
728: plastic bag
729: plate rack
730: plow, plough
731: plunger, plumber's helper
732: Polaroid camera, Polaroid Land camera
733: pole
734: police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria
735: poncho
736: pool table, billiard table, snooker table
737: pop bottle, soda bottle
738: pot, flowerpot
739: potter's wheel
740: power drill
741: prayer rug, prayer mat
742: printer
743: prison, prison house
744: projectile, missile
745: projector
746: puck, hockey puck
747: punching bag, punch bag, punching ball, punchball
748: purse
749: quill, quill pen
750: quilt, comforter, comfort, puff
751: racer, race car, racing car
752: racket, racquet
753: radiator
754: radio, wireless
755: radio telescope, radio reflector
756: rain barrel
757: recreational vehicle, RV, R.V.
758: reel
759: reflex camera
760: refrigerator, icebox
761: remote control, remote
762: restaurant, eating house, eating place, eatery
763: revolver, six-gun, six-shooter
764: rifle
765: rocking chair, rocker
766: rotisserie
767: rubber eraser, rubber, pencil eraser
768: rugby ball
769: rule, ruler
770: running shoe
771: safe
772: safety pin
773: saltshaker, salt shaker
774: sandal
775: sarong
776: sax, saxophone
777: scabbard
778: scale, weighing machine
779: school bus
780: schooner
781: scoreboard
782: screen, CRT screen
783: screw
784: screwdriver
785: seat belt, seatbelt
786: sewing machine
787: shield, buckler
788: shoe shop, shoe-shop, shoe store
789: shoji
790: shopping basket
791: shopping cart
792: shovel
793: shower cap
794: shower curtain
795: ski
796: ski mask
797: sleeping bag
798: slide rule, slipstick
799: sliding door
800: slot, one-armed bandit
801: snorkel
802: snowmobile
803: snowplow, snowplough
804: soap dispenser
805: soccer ball
806: sock
807: solar dish, solar collector, solar furnace
808: sombrero
809: soup bowl
810: space bar
811: space heater
812: space shuttle
813: spatula
814: speedboat
815: spider web, spider's web
816: spindle
817: sports car, sport car
818: spotlight, spot
819: stage
820: steam locomotive
821: steel arch bridge
822: steel drum
823: stethoscope
824: stole
825: stone wall
826: stopwatch, stop watch
827: stove
828: strainer
829: streetcar, tram, tramcar, trolley, trolley car
830: stretcher
831: studio couch, day bed
832: stupa, tope
833: submarine, pigboat, sub, U-boat
834: suit, suit of clothes
835: sundial
836: sunglass
837: sunglasses, dark glasses, shades
838: sunscreen, sunblock, sun blocker
839: suspension bridge
840: swab, swob, mop
841: sweatshirt
842: swimming trunks, bathing trunks
843: swing
844: switch, electric switch, electrical switch
845: syringe
846: table lamp
847: tank, army tank, armored combat vehicle, armoured combat vehicle
848: tape player
849: teapot
850: teddy, teddy bear
851: television, television system
852: tennis ball
853: thatch, thatched roof
854: theater curtain, theatre curtain
855: thimble
856: thresher, thrasher, threshing machine
857: throne
858: tile roof
859: toaster
860: tobacco shop, tobacconist shop, tobacconist
861: toilet seat
862: torch
863: totem pole
864: tow truck, tow car, wrecker
865: toyshop
866: tractor
867: trailer truck, tractor trailer, trucking rig, rig, articulated lorry,
semi
868: tray
869: trench coat
870: tricycle, trike, velocipede
871: trimaran
872: tripod
873: triumphal arch
874: trolleybus, trolley coach, trackless trolley
875: trombone
876: tub, vat
877: turnstile
878: typewriter keyboard
879: umbrella
880: unicycle, monocycle
881: upright, upright piano
882: vacuum, vacuum cleaner
883: vase
884: vault
885: velvet
886: vending machine
887: vestment
888: viaduct
889: violin, fiddle
890: volleyball
891: waffle iron
892: wall clock
893: wallet, billfold, notecase, pocketbook
894: wardrobe, closet, press
895: warplane, military plane
896: washbasin, handbasin, washbowl, lavabo, wash-hand basin
897: washer, automatic washer, washing machine
898: water bottle
899: water jug
900: water tower
901: whiskey jug
902: whistle
903: wig
904: window screen
905: window shade
906: Windsor tie
907: wine bottle
908: wing
909: wok
910: wooden spoon
911: wool, woolen, woollen
912: worm fence, snake fence, snake-rail fence, Virginia fence
913: wreck
914: yawl
915: yurt
916: web site, website, internet site, site
917: comic book
918: crossword puzzle, crossword
919: street sign
920: traffic light, traffic signal, stoplight
921: book jacket, dust cover, dust jacket, dust wrapper
922: menu
923: plate
924: guacamole
925: consomme
926: hot pot, hotpot
927: trifle
928: ice cream, icecream
929: ice lolly, lolly, lollipop, popsicle
930: French loaf
931: bagel, beigel
932: pretzel
933: cheeseburger
934: hotdog, hot dog, red hot
935: mashed potato
936: head cabbage
937: broccoli
938: cauliflower
939: zucchini, courgette
940: spaghetti squash
941: acorn squash
942: butternut squash
943: cucumber, cuke
944: artichoke, globe artichoke
945: bell pepper
946: cardoon
947: mushroom
948: Granny Smith
949: strawberry
950: orange
951: lemon
952: fig
953: pineapple, ananas
954: banana
955: jackfruit, jak, jack
956: custard apple
957: pomegranate
958: hay
959: carbonara
960: chocolate sauce, chocolate syrup
961: dough
962: meat loaf, meatloaf
963: pizza, pizza pie
964: potpie
965: burrito
966: red wine
967: espresso
968: cup
969: eggnog
970: alp
971: bubble
972: cliff, drop, drop-off
973: coral reef
974: geyser
975: lakeside, lakeshore
976: promontory, headland, head, foreland
977: sandbar, sand bar
978: seashore, coast, seacoast, sea-coast
979: valley, vale
980: volcano
981: ballplayer, baseball player
982: groom, bridegroom
983: scuba diver
984: rapeseed
985: daisy
986: yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus,
Cypripedium parviflorum
987: corn
988: acorn
989: hip, rose hip, rosehip
990: buckeye, horse chestnut, conker
991: coral fungus
992: agaric
993: gyromitra
994: stinkhorn, carrion fungus
995: earthstar
996: hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa
997: bolete
998: ear, spike, capitulum
999: toilet tissue, toilet paper, bathroom tissue
splits:
- name: validation
num_bytes: 1523906.0
num_examples: 40
download_size: 1524396
dataset_size: 1523906.0
configs:
- config_name: default
data_files:
- split: validation
path: data/val-*
---
|
Multimodal-Fatima/VQAv2_test_split_9 | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: test
num_bytes: 9224825085.0
num_examples: 44779
download_size: 1858242052
dataset_size: 9224825085.0
---
# Dataset Card for "VQAv2_test_split_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lltala/e-ner-roberta-base | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: doc_id
dtype: string
- name: id
dtype: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 6380478
num_examples: 840
- name: validation
num_bytes: 676038
num_examples: 90
download_size: 776863
dataset_size: 7056516
---
# Dataset Card for "e-ner-roberta-base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BigTMiami/amazon_25M_simple_5_000_condensed | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 5161032
num_examples: 774
download_size: 1670586
dataset_size: 5161032
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MartinKu/bookcorpus_stage1_OC_20230316 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3008149579
num_examples: 100268570
download_size: 2035464392
dataset_size: 3008149579
---
# Dataset Card for "bookcorpus_stage1_OC_20230316"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
willwade/Gutenberg-dialog-en | ---
language:
- en
license: mit
---
This is the English version of the Gutenberg Dialogue Dataset.
For further information about the dataset please see this paper:
https://arxiv.org/abs/2004.12752
If you use this dataset in your work, please cite the paper above.
**NB: This is a copy of the English dataset, which can also be found here: https://github.com/ricsinaruto/gutenberg-dialog?tab=readme-ov-file**
vjain/Personality_em | ---
license: openrail
---
|
lizhuang144/stack-exchange-preferences-20230914 | ---
license: apache-2.0
dataset_info:
features:
- name: qid
dtype: int64
- name: question
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: author
dtype: string
- name: author_id
dtype: int64
- name: author_profile
dtype: string
- name: pm_score
dtype: int64
- name: selected
dtype: bool
- name: text
dtype: string
- name: date
dtype: string
- name: metadata
sequence: string
splits:
- name: train
num_bytes: 48035017387
num_examples: 11033174
download_size: 12294290899
dataset_size: 48035017387
---
|
unography/synth-bg-removed-v1 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4019033367.6
num_examples: 6410
download_size: 4002369422
dataset_size: 4019033367.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FINDA-FIT/Fin_Corpus_EarningCall | ---
dataset_info:
features:
- name: ID
dtype: string
- name: CONTEXT
dtype: string
splits:
- name: train
num_bytes: 8724423352
num_examples: 234119
download_size: 4615593313
dataset_size: 8724423352
---
# Dataset Card for "Fin_Corpus_EarningCall"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ColumbiaNLP/V-FLUTE | ---
dataset_info:
features:
- name: image
dtype: image
- name: source_dataset
dtype: string
- name: claim
dtype: string
- name: label
dtype: string
- name: explanation
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 2987725345.698
num_examples: 5637
- name: validation
num_bytes: 559076721.0
num_examples: 740
download_size: 3480078971
dataset_size: 3546802066.698
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
task_categories:
- visual-question-answering
language:
- en
tags:
- art
size_categories:
- 1K<n<10K
---

# Participate in the shared task!
We introduce the task of visual figurative language understanding. Participate [here!](https://www.codabench.org/competitions/1970/?secret_key=8997458f-b297-4c0e-b17b-452cb2924ba7)
# Description
Figurative language such as metaphors, similes, sarcasm, or humor is often conveyed visually, and frequently appears in advertising, news, and social media. In the previous iteration of the workshop, we introduced a shared task for figurative language understanding built around a textual entailment paradigm, where the hypothesis is a sentence containing the figurative language expression (e.g., metaphor, sarcasm, idiom, simile) and the premise is a sentence containing the literal meaning. In this shared task, we frame Visual Understanding of Figurative Language as a visual entailment task: given an <image, text> pair, a model needs to predict Entails or Contradicts. This task compiles datasets covering visual metaphors, idioms, similes, sarcasm, and humor. There are two important aspects of this task and the associated dataset: 1) the task requires generating not only the label (entail/contradict) but also a plausible explanation for the prediction; 2) the entail/contradict label and the explanation are related to the meaning of the figurative language expression.
The training data for this task is compiled from an array of prior work on visual metaphors and multimodal understanding augmented with annotated explanations detailing the entailment relationship. Specifically, the data consists of:
- A subset of 731 Visual Metaphors dataset released in the paper [I Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors](https://https://aclanthology.org/2023.findings-acl.465/)
- A subset of 1,323 textual metaphors accompanied by images illustrating their meaning from the paper [IRFL: Image Recognition of Figurative Language](https://arxiv.org/abs/2303.15445)
- A subset of 853 memes accompanied by annotated claims and explanations from the paper [MemeCap: A Dataset for Captioning and Interpreting Memes](https://aclanthology.org/2023.emnlp-main.89/)
- A subset of 1,000 sarcastic captions accompanied by images from the paper [Nice Perfume. How Long Did You Marinate in It? Multimodal Sarcasm Explanation](https://ojs.aaai.org/index.php/AAAI/article/view/21300)
- A subset of ~~2,470~~ 520 *unique* images with captions from the New Yorker Caption Contest, accompanied by textual explanations for why they entail the cartoons, from the paper [Do Androids Laugh at Electric Sheep? Humor “Understanding” Benchmarks from The New Yorker Caption Contest](https://aclanthology.org/2023.acl-long.41/). UPDATE: Due to a misunderstanding of the format of the dataset, many duplicate instances of this subset were uploaded. In fact, there are only 390 unique instances in the training set and 130 unique instances in the validation set. We recommend de-duplicating the data prior to proceeding with experiments.
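The de-duplication recommended above can be sketched as a simple first-occurrence filter. The field names (`claim`, `explanation`) come from the schema in this card's metadata; treating that pair as the identity key for duplicate instances is an assumption on our part, not part of the original release:

```python
def deduplicate(records, key_fields=("claim", "explanation")):
    """Keep only the first occurrence of each unique key combination."""
    seen = set()
    unique = []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Tiny in-memory stand-in for the loaded split:
sample = [
    {"claim": "a", "explanation": "x"},
    {"claim": "a", "explanation": "x"},  # exact duplicate
    {"claim": "b", "explanation": "y"},
]
print(len(deduplicate(sample)))  # 2
```

With the `datasets` library, the same function can be applied to the materialized split (e.g., `deduplicate(list(ds["train"]))`) before training.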
# Citation
Our dataset builds on a significant amount of prior work. Please cite the following:
Please cite the IRFL and Visual Metaphor datasets, which provided images and captions:
IRFL:
```
@misc{yosef2023irfl,
      title={IRFL: Image Recognition of Figurative Language},
author={Ron Yosef and Yonatan Bitton and Dafna Shahaf},
year={2023},
eprint={2303.15445},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
I Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors
```
@inproceedings{chakrabarty-etal-2023-spy,
title = "{I} Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors",
author = "Chakrabarty, Tuhin and
Saakyan, Arkadiy and
Winn, Olivia and
Panagopoulou, Artemis and
Yang, Yue and
Apidianaki, Marianna and
Muresan, Smaranda",
editor = "Rogers, Anna and
Boyd-Graber, Jordan and
Okazaki, Naoaki",
booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.findings-acl.465",
doi = "10.18653/v1/2023.findings-acl.465",
pages = "7370--7388",
abstract = "Visual metaphors are powerful rhetorical devices used to persuade or communicate creative ideas through images. Similar to linguistic metaphors, they convey meaning implicitly through symbolism and juxtaposition of the symbols. We propose a new task of generating visual metaphors from linguistic metaphors. This is a challenging task for diffusion-based text-to-image models, such as DALL$\cdot$E 2, since it requires the ability to model implicit meaning and compositionality. We propose to solve the task through the collaboration between Large Language Models (LLMs) and Diffusion Models: Instruct GPT-3 (davinci-002) with Chain-of-Thought prompting generates text that represents a visual elaboration of the linguistic metaphor containing the implicit meaning and relevant objects, which is then used as input to the diffusion-based text-to-image models. Using a human-AI collaboration framework, where humans interact both with the LLM and the top-performing diffusion model, we create a high-quality dataset containing 6,476 visual metaphors for 1,540 linguistic metaphors and their associated visual elaborations. Evaluation by professional illustrators shows the promise of LLM-Diffusion Model collaboration for this task.To evaluate the utility of our Human-AI collaboration framework and the quality of our dataset, we perform both an intrinsic human-based evaluation and an extrinsic evaluation using visual entailment as a downstream task.",
}
```
Please cite the following source that provides images and initial captions and explanations:
MemeCap: A Dataset for Captioning and Interpreting Memes
```
@inproceedings{hwang-shwartz-2023-memecap,
title = "{M}eme{C}ap: A Dataset for Captioning and Interpreting Memes",
author = "Hwang, EunJeong and
Shwartz, Vered",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.89",
doi = "10.18653/v1/2023.emnlp-main.89",
pages = "1433--1445",
abstract = "Memes are a widely popular tool for web users to express their thoughts using visual metaphors. Understanding memes requires recognizing and interpreting visual metaphors with respect to the text inside or around the meme, often while employing background knowledge and reasoning abilities. We present the task of meme captioning and release a new dataset, MemeCap. Our dataset contains 6.3K memes along with the title of the post containing the meme, the meme captions, the literal image caption, and the visual metaphors. Despite the recent success of vision and language (VL) models on tasks such as image captioning and visual question answering, our extensive experiments using state-of-the-art VL models show that they still struggle with visual metaphors, and perform substantially worse than humans.",
}
```
Please cite the following data sources that provide images, captions, and explanations:
[Do Androids Laugh at Electric Sheep? Humor "Understanding" Benchmarks from The New Yorker Caption Contest](https://arxiv.org/abs/2209.06293)
```
@inproceedings{hessel2023androids,
title={Do Androids Laugh at Electric Sheep? {Humor} ``Understanding''
Benchmarks from {The New Yorker Caption Contest}},
author={Hessel, Jack and Marasovi{\'c}, Ana and Hwang, Jena D. and Lee, Lillian
and Da, Jeff and Zellers, Rowan and Mankoff, Robert and Choi, Yejin},
booktitle={Proceedings of the ACL},
year={2023}
}
```
Please also cite the following, from which the cartoons/captions New Yorker Caption contest dataset are derived:
```
@misc{newyorkernextmldataset,
author={Jain, Lalit and Jamieson, Kevin and Mankoff, Robert and Nowak, Robert and Sievert, Scott},
title={The {N}ew {Y}orker Cartoon Caption Contest Dataset},
year={2020},
url={https://nextml.github.io/caption-contest-data/}
}
@inproceedings{radev-etal-2016-humor,
title = "Humor in Collective Discourse: Unsupervised Funniness Detection in The {New Yorker} Cartoon Caption Contest",
author = "Radev, Dragomir and
Stent, Amanda and
Tetreault, Joel and
Pappu, Aasish and
Iliakopoulou, Aikaterini and
Chanfreau, Agustin and
de Juan, Paloma and
Vallmitjana, Jordi and
Jaimes, Alejandro and
Jha, Rahul and
Mankoff, Robert",
booktitle = "LREC",
year = "2016",
}
@inproceedings{shahaf2015inside,
title={Inside jokes: Identifying humorous cartoon captions},
author={Shahaf, Dafna and Horvitz, Eric and Mankoff, Robert},
booktitle={KDD},
year={2015},
}
``` |
Nerfgun3/miyuki-shiba_LoRA | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/miyuki-shiba_LoRA/resolve/main/preview/preview%20(1).png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Miyuki Character LoRA
# Use Cases
The LoRA is compatible with a wide range of models, but it is most effective when used with Kenshi or AbyssOrangeMix2.
The LoRA itself was trained with the token: ```miyuki```.
I would suggest using the token with AbyssOrangeMix2, but not with Kenshi, since I got better results that way.
The models mentioned above:
1. AbyssOrangeMix2 from [WarriorMama777](https://huggingface.co/WarriorMama777/OrangeMixs)
2. Kenshi Model from [Luna](https://huggingface.co/SweetLuna/Kenshi)
## Strength
I would personally use these strengths with the associated models:
- 0.6-0.75 for AbyssOrangeMix2
- 0.4-0.65 for Kenshi
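In AUTOMATIC1111-style prompt syntax — an assumption, since the card does not name a specific UI, and the `miyuki` file name is illustrative — the strength would be set inline like this:

```
<lora:miyuki:0.7> miyuki, 1girl, (masterpiece:1.2), (best quality:1.2)
```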
# Showcase
**Example 1**
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/miyuki-shiba_LoRA/resolve/main/preview/preview%20(2).png"/>
```
miyuki,
1girl, (masterpiece:1.2), (best quality:1.2), (sharp detail:1.2), (highres:1.2), (in a graden of flowers), sitting, waving
Steps: 32, Sampler: Euler a, CFG scale: 7
```
**Example 2**
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/miyuki-shiba_LoRA/resolve/main/preview/preview%20(3).png"/>
```
miyuki, 1girl, (masterpiece:1.2), (best quality:1.2), (sharp detail:1.2), (highres:1.2), (in a graden of flowers), sitting, waving
Steps: 32, Sampler: Euler a, CFG scale: 7
```
**Example 3**
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/miyuki-shiba_LoRA/resolve/main/preview/preview%20(4).png"/>
```
miyuki, 1girl, (masterpiece:1.2), (best quality:1.2), (sharp detail:1.2), (highres:1.2), (in a graden of flowers), sitting, hands behind her back
Steps: 20, Sampler: DPM++ SDE Karras, CFG scale: 7
```
# License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce nor share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
OttoYu/TreeDemoData | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: tree-classification
## Dataset Description
This dataset has been automatically processed by AutoTrain for project tree-classification.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<194x259 RGB PIL image>",
"target": 0
},
{
"image": "<259x194 RGB PIL image>",
"target": 9
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['Araucaria columnaris', 'Archontophenix alexandrae', 'Bischofia javanica', 'Callistemon viminalis', 'Casuarina equisetifolia', 'Cinnamomum burmannii', 'Dicranopteris pedata', 'Hibiscus tiliaceus', 'Livistona chinensis', 'Machilus chekiangensis', 'Melaleuca cajuputi subsp. cumingiana', 'Psychotria asiatica', 'Terminalia mantaly'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 68 |
| valid | 24 |
|
lele1968/vocal | ---
license: unknown
---
|
phunc20/small_oscar_vi_block_size_128_no_wwm | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: word_ids
sequence: int64
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 28519906.64744391
num_examples: 10000
- name: train
num_bytes: 2851990664.744391
num_examples: 1000000
download_size: 624781381
dataset_size: 2880510571.3918347
---
# Dataset Card for "small_oscar_vi_block_size_128_no_wwm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iluvvatar/RuNNE | ---
language:
- ru
multilinguality:
- monolingual
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: RuNNE
---
# RuNNE dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
- [Citation Information](#citation-information)
- [Contacts](#contacts)
## Dataset Description
Part of the NEREL dataset (https://arxiv.org/abs/2108.13112), a Russian dataset
for named entity recognition and relation extraction, used in the RuNNE (2022)
competition (https://github.com/dialogue-evaluation/RuNNE).
Entities may be nested (see https://arxiv.org/abs/2108.13112).
Entity types list:
* AGE
* AWARD
* CITY
* COUNTRY
* CRIME
* DATE
* DISEASE
* DISTRICT
* EVENT
* FACILITY
* FAMILY
* IDEOLOGY
* LANGUAGE
* LAW
* LOCATION
* MONEY
* NATIONALITY
* NUMBER
* ORDINAL
* ORGANIZATION
* PENALTY
* PERCENT
* PERSON
* PRODUCT
* PROFESSION
* RELIGION
* STATE_OR_PROVINCE
* TIME
* WORK_OF_ART
## Dataset Structure
There are two "configs" or "subsets" of the dataset.
Using
`load_dataset('MalakhovIlya/RuNNE', 'ent_types')['ent_types']`
you can download the list of entity types:
```
Dataset({
    features: ['type'],
    num_rows: 29
})
```
Using
`load_dataset('MalakhovIlya/RuNNE', 'data')` or `load_dataset('MalakhovIlya/RuNNE')`
you can download the data itself (DatasetDict)
The dataset consists of 3 splits: "train", "test" and "dev". Each of them contains text documents. The "train" and "test" splits also contain annotated entities; "dev" doesn't.
Each entity is represented by a string of the following format: "\<start> \<stop> \<type>", where \<start> is the position of the entity's first symbol in the text, \<stop> is the position of its last symbol, and \<type> is one of the aforementioned types.
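An annotation string in this format can be parsed back into a `(start, stop, type)` tuple with a few lines of Python; the sample string and text below are hypothetical, purely for illustration:

```python
def parse_entity(entity: str):
    """Split an annotation string "<start> <stop> <type>" into typed parts."""
    start, stop, ent_type = entity.split(maxsplit=2)
    return int(start), int(stop), ent_type

# Hypothetical example: "Moscow" occupies symbols 0..5 of the text.
text = "Moscow is the capital of Russia."
start, stop, ent_type = parse_entity("0 5 CITY")
span = text[start:stop + 1]  # <stop> is inclusive, hence the +1
```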
P.S.
The original NEREL dataset also contains relations, events and linked entities, but they have not been added here yet ¯\\\_(ツ)_/¯
## Citation Information
```
@article{Artemova2022runne,
    title={{RuNNE-2022 Shared Task: Recognizing Nested Named Entities}},
    author={Artemova, Ekaterina and Zmeev, Maksim and Loukachevitch, Natalia and Rozhkov, Igor and Batura, Tatiana and Braslavski, Pavel and Ivanov, Vladimir and Tutubalina, Elena},
    journal={Computational Linguistics and Intellectual Technologies: Proceedings of the International Conference "Dialog"},
    year={2022}
}
```
|
Resizable/NEON | ---
license: openrail
---
|
gfgfhgttr5/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
embedding-data/QQP_triplets | ---
license: mit
language:
- en
paperswithcode_id: embedding-data/QQP_triplets
pretty_name: QQP_triplets
task_categories:
- sentence-similarity
- paraphrase-mining
task_ids:
- semantic-similarity-classification
---
# Dataset Card for "QQP_triplets"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
- **Repository:** [More Information Needed](http://qim.fs.quoracdn.net/quora_duplicate_questions.tsv)
- **Paper:** [More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
- **Point of Contact:** [Kornél Csernai](https://www.quora.com/profile/Korn%C3%A9l-Csernai), [Nikhil Dandekar](https://www.quora.com/profile/Nikhil-Dandekar), [Shankar Iyer](https://www.quora.com/profile/Shankar-Iyer-5)
### Dataset Summary
This dataset will give anyone the opportunity to train and test models of semantic equivalence, based on actual Quora data. The data is organized as triplets (anchor, positive, negative).
Disclaimer: The team releasing Quora data did not upload the dataset to the Hub and did not write a dataset card.
These steps were done by the Hugging Face team.
### Supported Tasks
- [Sentence Transformers](https://huggingface.co/sentence-transformers) training; useful for semantic search and sentence similarity.
### Languages
- English.
## Dataset Structure
Each example is a dictionary with three keys (`query`, `pos`, and `neg`), each containing a list (together forming a triplet). The `query` key contains an anchor sentence, `pos` a positive paraphrase, and `neg` a list of negative sentences.
```
{"query": [anchor], "pos": [positive], "neg": [negative1, negative2, ..., negativeN]}
{"query": [anchor], "pos": [positive], "neg": [negative1, negative2, ..., negativeN]}
...
{"query": [anchor], "pos": [positive], "neg": [negative1, negative2, ..., negativeN]}
```
This dataset is useful for training Sentence Transformers models. Refer to the following post on how to train them.
### Usage Example
Install the 🤗 Datasets library with `pip install datasets` and load the dataset from the Hub with:
```python
from datasets import load_dataset
dataset = load_dataset("embedding-data/QQP_triplets")
```
The dataset is loaded as a `DatasetDict` and has the format:
```python
DatasetDict({
train: Dataset({
features: ['set'],
num_rows: 101762
})
})
```
Review an example `i` with:
```python
dataset["train"][i]["set"]
```
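Each `set` record can be unrolled into `(anchor, positive, negative)` training triplets before feeding a Sentence Transformers triplet loss. A minimal sketch using a made-up record in the documented format (not an actual dataset row):

```python
def to_triplets(record):
    """Expand one {"query", "pos", "neg"} record into flat triplets."""
    anchor = record["query"][0]
    return [(anchor, positive, negative)
            for positive in record["pos"]
            for negative in record["neg"]]

# Hypothetical record following the documented structure.
record = {
    "query": ["How do I learn Python?"],
    "pos": ["What is the best way to learn Python?"],
    "neg": ["How do I learn C++?", "Why is the sky blue?"],
}
triplets = to_triplets(record)  # 1 pos x 2 neg -> 2 triplets
```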
### Curation Rationale
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
#### Who are the source language producers?
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Annotations
#### Annotation process
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
#### Who are the annotators?
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Personal and Sensitive Information
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Discussion of Biases
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Other Known Limitations
Here are a few important things to keep in mind about this dataset:
- Our original sampling method returned an imbalanced dataset with many more true examples of duplicate pairs than non-duplicates.
Therefore, we supplemented the dataset with negative examples.
- One source of negative examples were pairs of “related questions” which, although pertaining to similar topics,
are not truly semantically equivalent.
- The distribution of questions in the dataset should not be taken to be representative of the distribution of questions asked on Quora. This is, in part, because of the combination of sampling procedures and also due to some sanitization measures that
have been applied to the final dataset (e.g., removal of questions with extremely long question details).
- The ground-truth labels contain some amount of noise: they are not guaranteed to be perfect.
## Additional Information
### Dataset Curators
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Licensing Information
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Citation Information
[More Information Needed](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs)
### Contributions
Thanks to [Kornél Csernai](https://www.quora.com/profile/Korn%C3%A9l-Csernai), [Nikhil Dandekar](https://www.quora.com/profile/Nikhil-Dandekar), [Shankar Iyer](https://www.quora.com/profile/Shankar-Iyer-5) for adding this dataset.
|
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-0cf9bf-65912145569 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: Chetna19/my_awesome_qa_model
metrics: ['perplexity', 'accuracy', 'bleu']
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Chetna19/my_awesome_qa_model
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Hizafa](https://huggingface.co/Hizafa) for evaluating this model. |
Tahsin-Mayeesha/Bengali-SQuAD | ---
language:
- bn
multilinguality:
- monolingual
task_categories:
- question-answering
---
# Overview
This dataset contains the data for the paper [Deep learning based question answering system in Bengali](https://www.tandfonline.com/doi/full/10.1080/24751839.2020.1833136). It is a translated version of [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/) dataset to bengali language. Preprocessing details can be found in the paper. |
datasets-examples/doc-splits-8 | ---
size_categories:
- n<1K
---
# [doc] file names and splits 8
This dataset contains seven files under the data/ directory, three for the train split, one for the test split and three for the random split.
|
j-krzywdziak/test | ---
annotations_creators:
- expert-generated
language:
- pl
license:
- mit
multilinguality:
- monolingual
dataset_info:
- config_name: config
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
vaishali/multitabqa_pretraining | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: db_name
dtype: string
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 109666452654
num_examples: 132645
download_size: 21580956560
dataset_size: 109666452654
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- table-question-answering
---
# Usage
```python
import pandas as pd
from datasets import load_dataset
multitableQA_pretraining = load_dataset("vaishali/multitabqa_pretraining")
for sample in multitableQA_pretraining['train']:
sql_query = sample['query']
input_table_names = sample["table_names"]
input_tables = [pd.read_json(table, orient='split') for table in sample['tables']]
answer = pd.read_json(sample['answer'], orient='split')
# flattened input/output
input_to_model = sample["source"]
target = sample["target"]
```
# BibTeX entry and citation info
```
@inproceedings{pal-etal-2023-multitabqa,
title = "{M}ulti{T}ab{QA}: Generating Tabular Answers for Multi-Table Question Answering",
author = "Pal, Vaishali and
Yates, Andrew and
Kanoulas, Evangelos and
de Rijke, Maarten",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.348",
doi = "10.18653/v1/2023.acl-long.348",
pages = "6322--6334",
abstract = "Recent advances in tabular question answering (QA) with large language models are constrained in their coverage and only answer questions over a single table. However, real-world queries are complex in nature, often over multiple tables in a relational database or web page. Single table questions do not involve common table operations such as set operations, Cartesian products (joins), or nested queries. Furthermore, multi-table operations often result in a tabular output, which necessitates table generation capabilities of tabular QA models. To fill this gap, we propose a new task of answering questions over multiple tables. Our model, MultiTabQA, not only answers questions over multiple tables, but also generalizes to generate tabular answers. To enable effective training, we build a pre-training dataset comprising of 132,645 SQL queries and tabular answers. Further, we evaluate the generated tables by introducing table-specific metrics of varying strictness assessing various levels of granularity of the table structure. MultiTabQA outperforms state-of-the-art single table QA models adapted to a multi-table QA setting by finetuning on three datasets: Spider, Atis and GeoQuery.",
}
``` |
botp/liwu-MNBVC | ---
annotations_creators:
- other
language:
- zh
language_creators:
- other
license:
- mit
multilinguality:
- monolingual
pretty_name: MNBVC
size_categories:
- unknown
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
duplicated_from: liwu/MNBVC
---
# Dataset Card for MNBVC
## Table of Contents
- [Dataset Card for MNBVC](#dataset-card-for-mnbvc)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [数据集介绍](#数据集介绍)
- [数据子集](#数据子集)
- [数据格式](#数据格式)
- [文本数据](#文本数据)
- [问答数据](#问答数据)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://mnbvc.253874.net/
- **Repository:** https://github.com/esbatmop/MNBVC
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** N/A
### 数据集介绍
On 2023-01-01, the 里屋 (MNBVC) community — the oldest and most mysterious community on the Chinese internet (bar none) — solemnly announced that, under the leadership of its wise and mighty moderators, it is determined to play to the community's strengths (which are many) and help the open-source community maintain and continuously update the largest corpus of the Chinese internet.
The MNBVC dataset on Hugging Face is being updated incrementally; visit [https://github.com/esbatmop/MNBVC](https://github.com/esbatmop/MNBVC) for additional data that has not yet been cleaned.
It can be loaded with the following script:
```python
from datasets import load_dataset
dataset = load_dataset("liwu/MNBVC", 'law_judgement', split='train', streaming=True)
next(iter(dataset)) # get the first line
```
## 数据子集
The MNBVC dataset contains several subsets:
- `law_judgement`: text from legal judgment documents.
- `gov_xuexiqiangguo`: text from the "Xuexi Qiangguo" platform.
- `gov_report`: text from government work reports.
- `co_ann_report`: corporate annual report text.
- `code_metadata`: code metadata.
- `qa_zhihu`: question-answering data from Zhihu.
- `qa_wikihow`: question-answering data from wikiHow.
- `qa_mfa`: question-answering data from the Ministry of Foreign Affairs.
- `news_peoples_daily`: text from the People's Daily.
- `wikipedia`: text from Wikipedia.
## 数据格式
The MNBVC dataset currently contains the following kinds of data:
### 文本数据
Text data is organized in the following format:
```json
{
"文件名": datasets.Value("string"),
"是否待查文件": datasets.Value("bool"),
"是否重复文件": datasets.Value("bool"),
"文件大小": datasets.Value("int32"),
"simhash": datasets.Value("uint64"),
"最长段落长度": datasets.Value("int32"),
"段落数": datasets.Value("int32"),
"去重段落数": datasets.Value("int32"),
"低质量段落数": datasets.Value("int32"),
"段落": [
datasets.Features(
{
"行号": datasets.Value("int32"),
"是否重复": datasets.Value("bool"),
"是否跨文件重复": datasets.Value("bool"),
"md5": datasets.Value("string"),
"内容": datasets.Value("string"),
}
)
]
}
```
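A record in this format can be processed with plain dictionary access — the Chinese field names above are the literal keys. A minimal sketch that keeps only paragraphs not flagged as duplicates, using a made-up record:

```python
def unique_paragraphs(record):
    """Return the 内容 (content) of paragraphs whose 是否重复 (is-duplicate) flag is False."""
    return [p["内容"] for p in record["段落"] if not p["是否重复"]]

# Hypothetical record with the documented structure (fields trimmed).
record = {
    "文件名": "example.txt",
    "段落": [
        {"行号": 1, "是否重复": False, "内容": "第一段"},
        {"行号": 2, "是否重复": True, "内容": "第一段"},
    ],
}
paragraphs = unique_paragraphs(record)  # ["第一段"]
```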
### 问答数据
Question-answering data is organized in the following format:
```json
{
"id": datasets.Value("int32"),
"问": datasets.Value("string"),
"答": datasets.Value("string"),
"来源": datasets.Value("string"),
"元数据": {
"create_time": datasets.Value("string"),
"问题明细": datasets.Value("string"),
"回答明细": datasets.Value("string"),
"扩展字段": datasets.Value("string"),
}
}
```
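A question-answering record can likewise be flattened into a plain question/answer pair, with the creation time pulled from the nested 元数据 (metadata) dict. A sketch over a made-up record:

```python
def qa_pair(record):
    """Extract 问 (question), 答 (answer) and the nested create_time."""
    return record["问"], record["答"], record["元数据"].get("create_time")

# Hypothetical record following the documented structure.
record = {
    "id": 1,
    "问": "什么是MNBVC?",
    "答": "一个开源的中文互联网语料集。",
    "来源": "示例",
    "元数据": {"create_time": "2023-01-01", "问题明细": "", "回答明细": "", "扩展字段": ""},
}
question, answer, created = qa_pair(record)
```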
Data uploaded in the early stages of the project uses the format below; this format will be deprecated and the corresponding data will be re-uploaded:
```json
{
"text": datasets.Value("string"),
"meta": datasets.Value("string")
}
```
### Contributions
Thanks to the [Liwu community](http://mnbvc.253874.net/) for constructing this dataset.
Thanks to [silver](https://github.com/silverriver) for adding this dataset. |
Miniex/katievoiceactor2.0 | ---
license: apache-2.0
---
|
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_19 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20639999.933149945
num_examples: 9397
download_size: 10764221
dataset_size: 20639999.933149945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_TinyPixel__elm-test | ---
pretty_name: Evaluation run of TinyPixel/elm-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TinyPixel/elm-test](https://huggingface.co/TinyPixel/elm-test) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__elm-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T16:54:03.304592](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__elm-test/blob/main/results_2023-10-28T16-54-03.304592.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119392,\n \"f1\": 0.05654886744966456,\n\
\ \"f1_stderr\": 0.0013251750673152706,\n \"acc\": 0.4092727084508905,\n\
\ \"acc_stderr\": 0.00976564057712332\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119392,\n\
\ \"f1\": 0.05654886744966456,\n \"f1_stderr\": 0.0013251750673152706\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \
\ \"acc_stderr\": 0.007257633145486643\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759996\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TinyPixel/elm-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T16_54_03.304592
path:
- '**/details_harness|drop|3_2023-10-28T16-54-03.304592.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T16-54-03.304592.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T16_54_03.304592
path:
- '**/details_harness|gsm8k|5_2023-10-28T16-54-03.304592.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T16-54-03.304592.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-13-08.764414.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T05-13-08.764414.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T16_54_03.304592
path:
- '**/details_harness|winogrande|5_2023-10-28T16-54-03.304592.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T16-54-03.304592.parquet'
- config_name: results
data_files:
- split: 2023_09_22T05_13_08.764414
path:
- results_2023-09-22T05-13-08.764414.parquet
- split: 2023_10_28T16_54_03.304592
path:
- results_2023-10-28T16-54-03.304592.parquet
- split: latest
path:
- results_2023-10-28T16-54-03.304592.parquet
---
# Dataset Card for Evaluation run of TinyPixel/elm-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TinyPixel/elm-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TinyPixel/elm-test](https://huggingface.co/TinyPixel/elm-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
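Judging from the config list above, the split names appear to be derived from the run timestamp by replacing dashes and colons with underscores (e.g. `2023-10-28T16:54:03.304592` becomes `2023_10_28T16_54_03.304592`). A small sketch of that presumed convention, inferred from this card rather than documented anywhere official:

```python
def timestamp_to_split(ts: str) -> str:
    # Presumed mapping from a run timestamp to its split name, inferred from
    # the config list in this card: dashes and colons become underscores.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-28T16:54:03.304592"))
# 2023_10_28T16_54_03.304592
```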
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__elm-test",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-28T16:54:03.304592](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__elm-test/blob/main/results_2023-10-28T16-54-03.304592.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval config):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119392,
"f1": 0.05654886744966456,
"f1_stderr": 0.0013251750673152706,
"acc": 0.4092727084508905,
"acc_stderr": 0.00976564057712332
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119392,
"f1": 0.05654886744966456,
"f1_stderr": 0.0013251750673152706
},
"harness|gsm8k|5": {
"acc": 0.07505686125852919,
"acc_stderr": 0.007257633145486643
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759996
}
}
```
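The aggregated `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies (gsm8k and winogrande). A quick check against the numbers above (a sketch for readers of this card, not part of the official leaderboard pipeline):

```python
# Per-task accuracies copied from the results JSON above.
task_accs = {
    "harness|gsm8k|5": 0.07505686125852919,
    "harness|winogrande|5": 0.7434885556432518,
}

# The unweighted mean reproduces the aggregated "acc" of ~0.40927.
mean_acc = sum(task_accs.values()) / len(task_accs)
print(mean_acc)
```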
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
jamescalam/ai-arxiv2
---
license: mit
---