id stringlengths 2 115 | author stringlengths 2 42 ⌀ | last_modified timestamp[us, tz=UTC] | downloads int64 0 8.87M | likes int64 0 3.84k | paperswithcode_id stringlengths 2 45 ⌀ | tags list | lastModified timestamp[us, tz=UTC] | createdAt stringlengths 24 24 | key stringclasses 1 value | created timestamp[us] | card stringlengths 1 1.01M | embedding list | library_name stringclasses 21 values | pipeline_tag stringclasses 27 values | mask_token null | card_data null | widget_data null | model_index null | config null | transformers_info null | spaces null | safetensors null | transformersInfo null | modelId stringlengths 5 111 ⌀ | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
autoevaluate/autoeval-staging-eval-project-sms_spam-216c1ded-12215630 | autoevaluate | 2022-08-02T10:41:15Z | 73 | 0 | null | [
"autotrain",
"evaluation",
"region:us"
] | 2022-08-02T10:41:15Z | 2022-08-02T10:40:39.000Z | 2022-08-02T10:40:39 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- sms_spam
eval_info:
task: binary_classification
model: Rhuax/MiniLMv2-L12-H384-distilled-finetuned-spam-detection
metrics: []
dataset_name: sms_spam
dataset_config: plain_text
dataset_split: train
col_mapping:
text: sms
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: Rhuax/MiniLMv2-L12-H384-distilled-finetuned-spam-detection
* Dataset: sms_spam
* Config: plain_text
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Al-Ip](https://huggingface.co/Al-Ip) for evaluating this model. | [
-0.36846181750297546,
-0.4668126106262207,
0.16109631955623627,
0.20349259674549103,
-0.12875674664974213,
-0.10536196082830429,
-0.013128403574228287,
-0.4390326142311096,
-0.008283492177724838,
0.4924898147583008,
-0.8679866194725037,
-0.32682856917381287,
-0.8580334782600403,
0.03255536... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
casehold/casehold | casehold | 2023-10-04T19:55:29Z | 73 | 5 | null | [
"region:us"
] | 2023-10-04T19:55:29Z | 2023-03-27T23:04:36.000Z | 2023-03-27T23:04:36 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
matejklemen/falko_merlin | matejklemen | 2023-05-08T20:56:31Z | 73 | 0 | null | [
"license:cc-by-sa-4.0",
"region:us"
] | 2023-05-08T20:56:31Z | 2023-05-08T20:30:48.000Z | 2023-05-08T20:30:48 | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: src_tokens
sequence: string
- name: tgt_tokens
sequence: string
- name: corrections
list:
- name: idx_src
sequence: int32
- name: idx_tgt
sequence: int32
- name: corr_type
dtype: string
splits:
- name: train
num_bytes: 6981243
num_examples: 19237
- name: validation
num_bytes: 902510
num_examples: 2503
- name: test
num_bytes: 836757
num_examples: 2337
download_size: 85667586
dataset_size: 8720510
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jainr3/diffusiondb-pixelart | jainr3 | 2023-05-11T18:59:45Z | 73 | 8 | null | [
"task_categories:text-to-image",
"task_categories:image-to-text",
"task_ids:image-captioning",
"annotations_creators:no-annotation",
"language_creators:found",
"multilinguality:multilingual",
"size_categories:n>1T",
"source_datasets:modified",
"language:en",
"license:cc0-1.0",
"stable diffusion"... | 2023-05-11T18:59:45Z | 2023-05-11T17:28:21.000Z | 2023-05-11T17:28:21 | ---
layout: default
title: Home
nav_order: 1
has_children: false
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- cc0-1.0
multilinguality:
- multilingual
pretty_name: DiffusionDB-Pixelart
size_categories:
- n>1T
source_datasets:
- modified
tags:
- stable diffusion
- prompt engineering
- prompts
task_categories:
- text-to-image
- image-to-text
task_ids:
- image-captioning
---
# DiffusionDB-Pixelart
## Table of Contents
- [DiffusionDB](#diffusiondb)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Subset](#subset)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Dataset Metadata](#dataset-metadata)
- [Metadata Schema](#metadata-schema)
- [Data Splits](#data-splits)
- [Loading Data Subsets](#loading-data-subsets)
- [Method 1: Using Hugging Face Datasets Loader](#method-1-using-hugging-face-datasets-loader)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [DiffusionDB homepage](https://poloclub.github.io/diffusiondb)
- **Repository:** [DiffusionDB repository](https://github.com/poloclub/diffusiondb)
- **Distribution:** [DiffusionDB Hugging Face Dataset](https://huggingface.co/datasets/poloclub/diffusiondb)
- **Paper:** [DiffusionDB: A Large-scale Prompt Gallery Dataset for Text-to-Image Generative Models](https://arxiv.org/abs/2210.14896)
### Dataset Summary
**This is a subset of the DiffusionDB 2M dataset that has been converted into pixel-art style.**
DiffusionDB is the first large-scale text-to-image prompt dataset. It contains **14 million** images generated by Stable Diffusion using prompts and hyperparameters specified by real users.
DiffusionDB is publicly available at [🤗 Hugging Face Dataset](https://huggingface.co/datasets/poloclub/diffusiondb).
### Supported Tasks and Leaderboards
The unprecedented scale and diversity of this human-actuated dataset provide exciting research opportunities in understanding the interplay between prompts and generative models, detecting deepfakes, and designing human-AI interaction tools to help users more easily use these models.
### Languages
The text in the dataset is mostly English. It also contains other languages such as Spanish, Chinese, and Russian.
### Subset
DiffusionDB provides two subsets (DiffusionDB 2M and DiffusionDB Large) to support different needs. The pixel-art version of the data was derived from DiffusionDB 2M and contains only 2,000 examples.
|Subset|Num of Images|Num of Unique Prompts|Size|Image Directory|Metadata Table|
|:--|--:|--:|--:|--:|--:|
|DiffusionDB-pixelart|2k|~1.5k|~1.6GB|`images/`|`metadata.parquet`|
Images in DiffusionDB-pixelart are stored in `png` format.
## Dataset Structure
We use a modularized file structure to distribute DiffusionDB. The 2k images in DiffusionDB-pixelart are split into folders, where each folder contains 1,000 images and a JSON file that links these 1,000 images to their prompts and hyperparameters.
```bash
# DiffusionDB 2k
./
├── images
│ ├── part-000001
│ │ ├── 3bfcd9cf-26ea-4303-bbe1-b095853f5360.png
│ │ ├── 5f47c66c-51d4-4f2c-a872-a68518f44adb.png
│ │ ├── 66b428b9-55dc-4907-b116-55aaa887de30.png
│ │ ├── [...]
│ │ └── part-000001.json
│ ├── part-000002
│ ├── part-000003
│ ├── [...]
│ └── part-002000
└── metadata.parquet
```
These sub-folders have names `part-0xxxxx`, and each image has a unique name generated by [UUID Version 4](https://en.wikipedia.org/wiki/Universally_unique_identifier). The JSON file in a sub-folder has the same name as the sub-folder. Each image is a `PNG` file (DiffusionDB-pixelart). The JSON file contains key-value pairs mapping image filenames to their prompts and hyperparameters.
### Data Instances
For example, below is the image of `ec9b5e2c-028e-48ac-8857-a52814fd2a06.png` and its key-value pair in `part-000001.json`.
<img width="300" src="https://datasets-server.huggingface.co/assets/jainr3/diffusiondb-pixelart/--/2k_all/train/0/image/image.png">
```json
{
"ec9b5e2c-028e-48ac-8857-a52814fd2a06.png": {
"p": "doom eternal, game concept art, veins and worms, muscular, crustacean exoskeleton, chiroptera head, chiroptera ears, mecha, ferocious, fierce, hyperrealism, fine details, artstation, cgsociety, zbrush, no background ",
"se": 3312523387,
"c": 7.0,
"st": 50,
"sa": "k_euler"
},
}
```
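A minimal sketch of reading that mapping back in Python (assuming a local copy of a part folder's JSON file; `load_part` is a hypothetical helper name, and the reading of the short keys as prompt, seed, CFG scale, steps, and sampler follows the metadata table below):

```python
import json

def load_part(json_path):
    """Load the filename -> prompt/hyperparameter mapping for one part folder."""
    with open(json_path) as f:
        return json.load(f)

# Hypothetical usage with the example entry above:
# record = load_part("images/part-000001/part-000001.json")["ec9b5e2c-....png"]
# record["p"] is the prompt, record["se"] the seed, record["c"] the CFG scale,
# record["st"] the step count, and record["sa"] the sampler name.
```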
### Data Fields
- key: Unique image name
- `p`: Text
### Dataset Metadata
To help you easily access prompts and other attributes of images without downloading all the Zip files, we include a metadata table `metadata.parquet` for DiffusionDB-pixelart.
Each row of the table represents an image. We store the table in the Parquet format because Parquet is column-based: you can efficiently query individual columns (e.g., prompts) without reading the entire table.
Below are three random rows from `metadata.parquet`.
| image_name | prompt | part_id | seed | step | cfg | sampler | width | height | user_name | timestamp | image_nsfw | prompt_nsfw |
|:-----------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------:|-----------:|-------:|------:|----------:|--------:|---------:|:-----------------------------------------------------------------|:--------------------------|-------------:|--------------:|
| 0c46f719-1679-4c64-9ba9-f181e0eae811.png | a small liquid sculpture, corvette, viscous, reflective, digital art | 1050 | 2026845913 | 50 | 7 | 8 | 512 | 512 | c2f288a2ba9df65c38386ffaaf7749106fed29311835b63d578405db9dbcafdb | 2022-08-11 09:05:00+00:00 | 0.0845108 | 0.00383462 |
| a00bdeaa-14eb-4f6c-a303-97732177eae9.png | human sculpture of lanky tall alien on a romantic date at italian restaurant with smiling woman, nice restaurant, photography, bokeh | 905 | 1183522603 | 50 | 10 | 8 | 512 | 768 | df778e253e6d32168eb22279a9776b3cde107cc82da05517dd6d114724918651 | 2022-08-19 17:55:00+00:00 | 0.692934 | 0.109437 |
| 6e5024ce-65ed-47f3-b296-edb2813e3c5b.png | portrait of barbaric spanish conquistador, symmetrical, by yoichi hatakenaka, studio ghibli and dan mumford | 286 | 1713292358 | 50 | 7 | 8 | 512 | 640 | 1c2e93cfb1430adbd956be9c690705fe295cbee7d9ac12de1953ce5e76d89906 | 2022-08-12 03:26:00+00:00 | 0.0773138 | 0.0249675 |
#### Metadata Schema
`metadata.parquet` schema:
|Column|Type|Description|
|:---|:---|:---|
|`image_name`|`string`|Image UUID filename.|
|`text`|`string`|The text prompt used to generate this image.|
> **Warning**
> Although the Stable Diffusion model has an NSFW filter that automatically blurs user-generated NSFW images, this NSFW filter is not perfect—DiffusionDB still contains some NSFW images. Therefore, we compute and provide the NSFW scores for images and prompts using the state-of-the-art models. The distribution of these scores is shown below. Please decide an appropriate NSFW score threshold to filter out NSFW images before using DiffusionDB in your projects.
<img src="https://i.imgur.com/1RiGAXL.png" width="100%">
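One way to apply such a threshold is sketched below; the 0.5 default and the `filter_sfw` helper name are illustrative assumptions, not part of the official tooling:

```python
import pandas as pd

def filter_sfw(metadata: pd.DataFrame, threshold: float = 0.5) -> pd.DataFrame:
    """Keep rows whose image and prompt NSFW scores are below the threshold.

    The 0.5 default is a placeholder; choose a threshold suited to your project.
    """
    mask = (metadata["image_nsfw"] < threshold) & (metadata["prompt_nsfw"] < threshold)
    return metadata[mask]
```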
### Data Splits
For DiffusionDB-pixelart, we split 2k images into folders where each folder contains 1,000 images and a JSON file.
### Loading Data Subsets
DiffusionDB is large! However, with our modularized file structure, you can easily load a desirable number of images and their prompts and hyperparameters. In the [`example-loading.ipynb`](https://github.com/poloclub/diffusiondb/blob/main/notebooks/example-loading.ipynb) notebook, we demonstrate three methods to load a subset of DiffusionDB. Below is a short summary.
#### Method 1: Using Hugging Face Datasets Loader
You can use the Hugging Face [`Datasets`](https://huggingface.co/docs/datasets/quickstart) library to easily load prompts and images from DiffusionDB. We pre-defined 16 DiffusionDB subsets (configurations) based on the number of instances. You can see all subsets in the [Dataset Preview](https://huggingface.co/datasets/poloclub/diffusiondb/viewer/all/train).
```python
from datasets import load_dataset
# Load the dataset with the `2k_random_1k` subset
dataset = load_dataset('jainr3/diffusiondb-pixelart', '2k_random_1k')
```
## Dataset Creation
### Curation Rationale
Recent diffusion models have gained immense popularity by enabling high-quality and controllable image generation based on text prompts written in natural language. Since the release of these models, people from different domains have quickly applied them to create award-winning artworks, synthetic radiology images, and even hyper-realistic videos.
However, generating images with desired details is difficult, as it requires users to write proper prompts specifying the exact expected results. Developing such prompts requires trial and error, and can often feel random and unprincipled. Simon Willison analogizes writing prompts to wizards learning “magical spells”: users do not understand why some prompts work, but they will add these prompts to their “spell book.” For example, to generate highly-detailed images, it has become a common practice to add special keywords such as “trending on artstation” and “unreal engine” in the prompt.
Prompt engineering has become a field of study in the context of text-to-text generation, where researchers systematically investigate how to construct prompts to effectively solve different down-stream tasks. As large text-to-image models are relatively new, there is a pressing need to understand how these models react to prompts, how to write effective prompts, and how to design tools to help users generate images.
To help researchers tackle these critical challenges, we create DiffusionDB, the first large-scale prompt dataset with 14 million real prompt-image pairs.
### Source Data
#### Initial Data Collection and Normalization
We construct DiffusionDB by scraping user-generated images on the official Stable Diffusion Discord server. We choose Stable Diffusion because it is currently the only open-source large text-to-image generative model, and all generated images have a CC0 1.0 Universal Public Domain Dedication license that waives all copyright and allows uses for any purpose. We choose the official [Stable Diffusion Discord server](https://discord.gg/stablediffusion) because it is public, and it has strict rules against generating and sharing illegal, hateful, or NSFW (not suitable for work, such as sexual and violent content) images. The server also disallows users to write or share prompts with personal information.
#### Who are the source language producers?
The language producers are users of the official [Stable Diffusion Discord server](https://discord.gg/stablediffusion).
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
The authors removed the Discord usernames from the dataset.
We decided to anonymize the dataset because some prompts might include sensitive information: explicitly linking them to their creators could cause harm.
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to help develop better understanding of large text-to-image generative models.
The unprecedented scale and diversity of this human-actuated dataset provide exciting research opportunities in understanding the interplay between prompts and generative models, detecting deepfakes, and designing human-AI interaction tools to help users more easily use these models.
It should be noted that we collect images and their prompts from the Stable Diffusion Discord server. The Discord server has rules against users generating or sharing harmful or NSFW (not suitable for work, such as sexual and violent content) images. The Stable Diffusion model used in the server also has an NSFW filter that blurs the generated images if it detects NSFW content. However, it is still possible that some users generated harmful images that were not detected by the NSFW filter or removed by the server moderators. Therefore, DiffusionDB can potentially contain these images. To mitigate the potential harm, we provide a [Google Form](https://forms.gle/GbYaSpRNYqxCafMZ9) on the [DiffusionDB website](https://poloclub.github.io/diffusiondb/) where users can report harmful or inappropriate images and prompts. We will closely monitor this form and remove reported images and prompts from DiffusionDB.
### Discussion of Biases
The 14 million images in DiffusionDB have diverse styles and categories. However, Discord can be a biased data source. Our images come from channels where early users could use a bot to use Stable Diffusion before release. As these users had started using Stable Diffusion before the model was public, we hypothesize that they are AI art enthusiasts and are likely to have experience with other text-to-image generative models. Therefore, the prompting style in DiffusionDB might not represent novice users. Similarly, the prompts in DiffusionDB might not generalize to domains that require specific knowledge, such as medical images.
### Other Known Limitations
**Generalizability.** Previous research has shown that a prompt that works well on one generative model might not give the optimal result when used with another model.
Therefore, different models may require users to write different prompts. For example, many Stable Diffusion prompts use commas to separate keywords, whereas this pattern is less common in prompts for DALL-E 2 or Midjourney. Thus, we caution researchers that some research findings from DiffusionDB might not be generalizable to other text-to-image generative models.
## Additional Information
### Dataset Curators
DiffusionDB is created by [Jay Wang](https://zijie.wang), [Evan Montoya](https://www.linkedin.com/in/evan-montoya-b252391b4/), [David Munechika](https://www.linkedin.com/in/dmunechika/), [Alex Yang](https://alexanderyang.me), [Ben Hoover](https://www.bhoov.com), [Polo Chau](https://faculty.cc.gatech.edu/~dchau/).
### Licensing Information
The DiffusionDB dataset is available under the [CC0 1.0 License](https://creativecommons.org/publicdomain/zero/1.0/).
The Python code in this repository is available under the [MIT License](https://github.com/poloclub/diffusiondb/blob/main/LICENSE).
### Citation Information
```bibtex
@article{wangDiffusionDBLargescalePrompt2022,
title = {{{DiffusionDB}}: {{A}} Large-Scale Prompt Gallery Dataset for Text-to-Image Generative Models},
author = {Wang, Zijie J. and Montoya, Evan and Munechika, David and Yang, Haoyang and Hoover, Benjamin and Chau, Duen Horng},
year = {2022},
journal = {arXiv:2210.14896 [cs]},
url = {https://arxiv.org/abs/2210.14896}
}
```
### Contributions
If you have any questions, feel free to [open an issue](https://github.com/poloclub/diffusiondb/issues/new) or contact the original author [Jay Wang](https://zijie.wang). | [
-0.7552851438522339,
-0.8665003776550293,
0.47276586294174194,
0.39642825722694397,
-0.15460938215255737,
0.12365695089101791,
0.1643640697002411,
-0.1962791085243225,
0.5646247267723083,
0.46335598826408386,
-0.7134152054786682,
-0.7729069590568542,
-0.6568992733955383,
0.0402874089777469... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ccmusic-database/music_genre | ccmusic-database | 2023-11-28T14:29:25Z | 73 | 8 | null | [
"task_categories:audio-classification",
"size_categories:10K<n<100K",
"language:zh",
"language:en",
"license:mit",
"music",
"art",
"region:us"
] | 2023-11-28T14:29:25Z | 2023-05-25T14:10:47.000Z | 2023-05-25T14:10:47 | ---
license: mit
task_categories:
- audio-classification
language:
- zh
- en
tags:
- music
- art
pretty_name: Music Genre Database
size_categories:
- 10K<n<100K
viewer: false
---
# Dataset Card for Music Genre Dataset
## Dataset Description
- **Homepage:** <https://ccmusic-database.github.io>
- **Repository:** <https://huggingface.co/datasets/ccmusic-database/music_genre>
- **Paper:** <https://doi.org/10.5281/zenodo.5676893>
- **Leaderboard:** <https://ccmusic-database.github.io/team.html>
- **Point of Contact:** N/A
### Dataset Summary
This database contains about 1700 musical pieces (.mp3 format) with lengths of 270-300s that are divided into 17 genres in total.
### Supported Tasks and Leaderboards
Audio classification
### Languages
Multilingual
## Usage
When performing a classification task, use only one of the `fst_level_label`, `sec_level_label`, and `thr_level_label` columns; do not mix them.
```python
from datasets import load_dataset
dataset = load_dataset("ccmusic-database/music_genre", split="test")
for item in dataset:
print(item)
```
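A minimal sketch of that rule, assuming each item exposes the columns listed in the structure table below (`select_label` is a hypothetical helper):

```python
def select_label(dataset, label_col="fst_level_label"):
    """Pair each spectrogram with exactly one label granularity.

    Mixing the 2-, 9-, and 12-class columns in a single target
    would conflate three different taxonomies, so pick one up front.
    """
    return [(item["mel"], item[label_col]) for item in dataset]
```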
## Dataset Structure
| mel | cqt | chroma | fst_level_label | sec_level_label | thr_level_label |
| :---: | :---: | :----: | :-------------: | :-------------: | :-------------: |
| jpg | jpg | jpg | 2-class | 9-class | 12-class |
### Data Instances
`.jpg` images distributed as `.zip` archives
### Data Fields
```
0_None
1_Classic
3_Symphony
4_Opera
5_Solo
6_Chamber
2_Non_classic
7_Pop
12_Pop_vocal_ballad
13_Adult_contemporary
14_Teen_pop
8_Dance_and_house
15_Contemporary_dance_pop
16_Dance_pop
9_Indie
17_Classic_indie_pop
18_Chamber_cabaret_and_art_pop
10_Soul_or_r_and_b
11_Rock
19_Adult_alternative_rock
20_Uplifting_anthemic_rock
21_Soft_rock
22_Acoustic_pop
```
### Data Splits
Train(80%), valid(10%), test(10%)
## Dataset Creation
### Curation Rationale
Promoting the development of AI in the music industry
### Source Data
#### Initial Data Collection and Normalization
Zhaorui Liu, Monan Zhou
#### Who are the source language producers?
Composers of the songs in the dataset
### Annotations
#### Annotation process
Students collected about 1700 musical pieces (.mp3 format) with lengths of 270-300s divided into 17 genres in total.
#### Who are the annotators?
Students from CCMUSIC
### Personal and Sensitive Information
Due to copyright issues with the original music, only mel spectrograms are provided in the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
Promoting the development of AI in the music industry
### Discussion of Biases
Most of the songs are in English
### Other Known Limitations
The samples are not well balanced across genres
## Additional Information
### Dataset Curators
Zijin Li
### Evaluation
Coming soon...
### Licensing Information
```
MIT License
Copyright (c) CCMUSIC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```
@dataset{zhaorui_liu_2021_5676893,
author = {Zhaorui Liu, Monan Zhou, Shenyang Xu, Zhaowen Wang, Wei Li and Zijin Li},
title = {CCMUSIC DATABASE: A Music Data Sharing Platform for Computational Musicology Research},
month = {nov},
year = {2021},
publisher = {Zenodo},
version = {1.1},
doi = {10.5281/zenodo.5676893},
url = {https://doi.org/10.5281/zenodo.5676893}
}
```
### Contributions
Provide a dataset for music genre classification | [
-0.5508933067321777,
-0.37933817505836487,
0.1712888926267624,
0.4227988123893738,
-0.32865339517593384,
0.048709820955991745,
-0.6064590215682983,
-0.26768380403518677,
0.43554437160491943,
0.5478919148445129,
-1.0023213624954224,
-1.2550616264343262,
-0.2685416042804718,
0.07559589296579... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
tasksource/mtop | tasksource | 2023-06-23T12:49:07Z | 73 | 0 | null | [
"task_categories:text-classification",
"multilinguality:multilingual",
"license:cc-by-sa-4.0",
"multilingual",
"intent",
"arxiv:2008.09335",
"region:us"
] | 2023-06-23T12:49:07Z | 2023-06-23T12:35:09.000Z | 2023-06-23T12:35:09 | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
tags:
- multilingual
- intent
multilinguality:
- multilingual
---
https://arxiv.org/pdf/2008.09335.pdf
```
@article{li2020mtop,
title={MTOP: A comprehensive multilingual task-oriented semantic parsing benchmark},
author={Li, Haoran and Arora, Abhinav and Chen, Shuohui and Gupta, Anchit and Gupta, Sonal and Mehdad, Yashar},
journal={arXiv preprint arXiv:2008.09335},
year={2020}
}
``` | [
-0.21128857135772705,
-0.5697579383850098,
0.42834317684173584,
0.2507728636264801,
-0.27918970584869385,
-0.2783900797367096,
-0.23715287446975708,
-0.7125025391578674,
-0.1271055042743683,
0.5944976806640625,
-0.5927535891532898,
-0.5850717425346375,
-0.6906294822692871,
0.36980456113815... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ammarnasr/the-stack-swift-clean | ammarnasr | 2023-08-14T21:20:23Z | 73 | 0 | null | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:code",
"license:openrail",
"code",
"region:us"
] | 2023-08-14T21:20:23Z | 2023-07-30T13:23:42.000Z | 2023-07-30T13:23:42 | ---
license: openrail
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 3582248477.9086223
num_examples: 806789
- name: test
num_bytes: 394048264.9973618
num_examples: 88747
- name: valid
num_bytes: 3982797.09401595
num_examples: 897
download_size: 1323156008
dataset_size: 3980279540
task_categories:
- text-generation
language:
- code
tags:
- code
pretty_name: TheStack-Swift
size_categories:
- 1M<n<10M
---
## Dataset 1: TheStack - Swift - Cleaned
**Description**: This dataset is drawn from TheStack Corpus, an open-source code dataset with over 3TB of GitHub data covering 48 programming languages. We selected a small portion of this dataset to optimize smaller language models for Swift, a popular statically typed language.
**Target Language**: Swift
**Dataset Size**:
- Training: 900,000 files
- Validation: 50,000 files
- Test: 50,000 files
**Preprocessing**:
1. Selected Swift as the target language due to its popularity on GitHub.
2. Filtered out files with average line length > 100 characters, maximum line length > 1000 characters, and alphabet ratio < 25%.
3. Split files into 90% training, 5% validation, and 5% test sets.
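The filtering rules in step 2 can be sketched as follows; this is an illustrative reimplementation, and details such as how the alphanumeric ratio was computed may differ from the original pipeline:

```python
def passes_filters(content: str) -> bool:
    """Return True if a source file survives the preprocessing filters:
    average line length <= 100, maximum line length <= 1000, and
    alphanumeric character ratio >= 25% (computed here over all characters,
    which is an assumption about the original pipeline)."""
    lines = content.splitlines() or [""]
    lengths = [len(line) for line in lines]
    avg_line_length = sum(lengths) / len(lengths)
    max_line_length = max(lengths)
    alphanum_fraction = sum(c.isalnum() for c in content) / max(len(content), 1)
    return (
        avg_line_length <= 100
        and max_line_length <= 1000
        and alphanum_fraction >= 0.25
    )
```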
**Tokenizer**: A Byte Pair Encoding (BPE) tokenizer with dedicated tab and whitespace tokens; the GPT-2 vocabulary was extended with these special tokens.
**Training Sequences**: Sequences constructed by joining training data text to reach a context length of 2048 tokens (1024 tokens for full fine-tuning). | [
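The sequence-construction step can be sketched like this; the separator token and the dropping of the trailing partial window are assumptions not stated in the card:

```python
from typing import Iterable, List

def pack_sequences(docs: Iterable[List[int]], context_length: int = 2048,
                   sep_token_id: int = 0) -> List[List[int]]:
    """Concatenate tokenized files (joined by a separator token) into one
    stream, then slice the stream into fixed-length training sequences.
    Any trailing remainder that cannot fill a full context window is dropped."""
    stream: List[int] = []
    for doc in docs:
        stream.extend(doc)
        stream.append(sep_token_id)
    return [stream[i:i + context_length]
            for i in range(0, len(stream) - context_length + 1, context_length)]
```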
-0.4547312557697296,
-0.4278813600540161,
0.06770535558462143,
-0.113585464656353,
-0.5033515095710754,
0.2744337320327759,
-0.1499747484922409,
-0.2974616289138794,
0.5053725242614746,
0.5894807577133179,
-0.5756897926330566,
-0.47355788946151733,
-0.4467293918132782,
-0.02951237559318542... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jxie/esol | jxie | 2023-08-04T22:25:16Z | 73 | 0 | null | [
"region:us"
] | 2023-08-04T22:25:16Z | 2023-08-04T22:25:04.000Z | 2023-08-04T22:25:04 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: float64
splits:
- name: train_0
num_bytes: 31089
num_examples: 902
- name: val_0
num_bytes: 3828
num_examples: 113
- name: test_0
num_bytes: 4016
num_examples: 113
- name: train_1
num_bytes: 31354
num_examples: 902
- name: val_1
num_bytes: 3731
num_examples: 113
- name: test_1
num_bytes: 3848
num_examples: 113
- name: train_2
num_bytes: 31095
num_examples: 902
- name: val_2
num_bytes: 3869
num_examples: 113
- name: test_2
num_bytes: 3969
num_examples: 113
download_size: 75468
dataset_size: 116799
---
# Dataset Card for "esol"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5813881158828735,
-0.31602743268013,
0.2145523875951767,
0.24889089167118073,
-0.08819841593503952,
0.136912003159523,
0.12856122851371765,
-0.3467366099357605,
0.8672113418579102,
0.6114344596862793,
-0.9663291573524475,
-0.8721596002578735,
-0.5760131478309631,
-0.1461511105298996,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
maitrang/viwiki_20230901 | maitrang | 2023-09-02T18:29:42Z | 73 | 0 | null | [
"region:us"
] | 2023-09-02T18:29:42Z | 2023-09-02T18:23:29.000Z | 2023-09-02T18:23:29 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1237663253
num_examples: 1287269
download_size: 562888266
dataset_size: 1237663253
---
# Dataset Card for "viwiki_20230901"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6856721043586731,
-0.1773686707019806,
0.13873746991157532,
0.2956811189651489,
-0.1927623152732849,
-0.19462725520133972,
0.1935664862394333,
-0.13556797802448273,
0.8816039562225342,
0.594804048538208,
-0.9545613527297974,
-0.6218705177307129,
-0.40767908096313477,
-0.0836644619703292... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
indonlp/nusatranslation_mt | indonlp | 2023-11-10T03:56:13Z | 73 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-10T03:56:13Z | 2023-09-07T12:58:44.000Z | 2023-09-07T12:58:44 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mb23/GraySpectrogram | mb23 | 2023-11-11T23:55:18Z | 73 | 0 | null | [
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-sa-4.0",
"music",
"spectrogram",
"region:us"
] | 2023-11-11T23:55:18Z | 2023-10-07T05:47:09.000Z | 2023-10-07T05:47:09 | ---
license: cc-by-sa-4.0
language:
- en
tags:
- music
- spectrogram
size_categories:
- 10K<n<100K
---
# Data from Google/MusicCaps converted to spectrograms
* <font color="red">The dataset viewer of this repository is truncated, so you may want to see <a href="https://huggingface.co/datasets/mb23/GraySpectrotram_example">this one</a> instead.</font>
## Dataset information
<table>
<thead>
<td>Image</td>
<td>caption</td>
<td>data_idx</td>
<td>number</td>
</thead>
<tbody>
<tr>
<td>1025px × 216px</td>
<td>Description of the music</td>
<td>Which source sample the data was generated from</td>
<td>Index among the 5-second segments</td>
</tr>
</tbody>
</table>
## How this dataset was made
* Code: https://colab.research.google.com/drive/13m792FEoXszj72viZuBtusYRUL1z6Cu2?usp=sharing
* Reference Kaggle Notebook: https://www.kaggle.com/code/osanseviero/musiccaps-explorer
```python
from PIL import Image
import IPython.display
import cv2
import librosa
import numpy as np

# 1. Load the wav file
y, sr = librosa.load("path/to/audio.wav")
# 2. Apply the short-time Fourier transform and convert the magnitudes to dB
D = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)  # build the spectrogram data with librosa
image = Image.fromarray(np.uint8(D), mode='L')  # 'L' specifies single-channel grayscale mode
image.save('spectrogram_{}.png')
```
## Recover music (waveform) from spectrogram
```python
from PIL import Image
import IPython.display
import librosa
import numpy as np

im = Image.open("path/to/spectrogram.png")
db_ud = np.uint8(np.array(im))
amp = librosa.db_to_amplitude(db_ud)
print(amp.shape)
# (1025, 861) for a spectrogram made from a 20-second wav file
# (1025, 431) for a 10-second wav file
# (1025, 216) for a 5-second wav file
y_inv = librosa.griffinlim(amp*200)
display(IPython.display.Audio(y_inv, rate=sr))  # sr comes from the earlier librosa.load call
```
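As background for the conversions above, decibels and amplitude are related by `amp = ref * 10**(db / 20)`; a dependency-free sketch of the round trip (the helper names here are my own, not part of librosa):

```python
import math

def amplitude_to_db(amp, ref=1.0):
    # Same relation librosa uses: dB = 20 * log10(amplitude / ref)
    return 20 * math.log10(amp / ref)

def db_to_amplitude(db, ref=1.0):
    # Inverse mapping: amplitude = ref * 10 ** (dB / 20)
    return ref * 10 ** (db / 20)

a = 0.25
db = amplitude_to_db(a)
print(db)                   # ≈ -12.04 dB
print(db_to_amplitude(db))  # ≈ 0.25, the round trip recovers the amplitude
```

Note that the uint8 quantization in the PNG step is lossy, so the recovered amplitudes (and hence the Griffin-Lim audio) are only approximate.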
## Example : How to use this
* <font color="red">Subsets <b>data 1300-1600</b> and <b>data 3400-3600</b> are not working now; they were removed first, so the `subset_name_list` below already excludes them.</font>
### 1 : get information about this dataset:
* copy this code~~
```python
'''
if you use Google Colab, remove # to install the packages below.
'''
#!pip install datasets
#!pip install huggingface-hub
#!huggingface-cli login
import datasets
from datasets import load_dataset
# make subset_name_list
subset_name_list = [
'data 0-200',
'data 200-600',
'data 600-1000',
'data 1000-1300',
'data 1600-2000',
'data 2000-2200',
'data 2200-2400',
'data 2400-2600',
'data 2600-2800',
'data 3000-3200',
'data 3200-3400',
'data 3600-3800',
'data 3800-4000',
'data 4000-4200',
'data 4200-4400',
'data 4400-4600',
'data 4600-4800',
'data 4800-5000',
'data 5000-5200',
'data 5200-5520'
]
# load all datasets
data = load_dataset("mb23/GraySpectrogram", subset_name_list[0])
for subset in subset_name_list[1:]:  # skip index 0, which is already loaded
    # subset_name_list above already excludes the removed subsets.
    print(subset)
    new_ds = load_dataset("mb23/GraySpectrogram", subset)
    new_dataset_train = datasets.concatenate_datasets([data["train"], new_ds["train"]])
    new_dataset_test = datasets.concatenate_datasets([data["test"], new_ds["test"]])
    # take the place of data[split]
    data["train"] = new_dataset_train
    data["test"] = new_dataset_test

data
```
### 2 : load dataset and change to dataloader:
* You can use the code below:
* <font color="red">...but (;・∀・) I don't know whether this code works efficiently, because I haven't tried it so far</font>
```python
import datasets
from datasets import load_dataset, Dataset, DatasetDict
from torchvision import transforms
from torch.utils.data import DataLoader

# BATCH_SIZE = ???
# IMG_SIZE = ???
# TRAIN_SIZE = ???  # the number of training data
# TEST_SIZE = ???   # the number of test data

def load_datasets():
    # Define data transforms
    data_transforms = [
        transforms.Resize((IMG_SIZE, IMG_SIZE)),
        transforms.ToTensor(),                    # Scales data into [0,1]
        transforms.Lambda(lambda t: (t * 2) - 1)  # Scale between [-1, 1]
    ]
    data_transform = transforms.Compose(data_transforms)

    # subset_name_list (defined in section 1) already excludes the removed subsets.
    data = load_dataset("mb23/GraySpectrogram", subset_name_list[0])
    for subset in subset_name_list[1:]:  # skip index 0, which is already loaded
        print(subset)
        new_ds = load_dataset("mb23/GraySpectrogram", subset)
        new_dataset_train = datasets.concatenate_datasets([data["train"], new_ds["train"]])
        new_dataset_test = datasets.concatenate_datasets([data["test"], new_ds["test"]])
        # take the place of data[split]
        data["train"] = new_dataset_train
        data["test"] = new_dataset_test

    # memo:
    # I don't know a good way to extract only the needed features, so this is brute force.
    # Ideally the features would be selected at load_dataset() time, but that doesn't seem possible.
    # It might be better to rebuild the repository and push_to_hub() again.
    new_dataset = dict()
    new_dataset["train"] = Dataset.from_dict({
        "image": data["train"]["image"],
        "caption": data["train"]["caption"]
    })
    new_dataset["test"] = Dataset.from_dict({
        "image": data["test"]["image"],
        "caption": data["test"]["caption"]
    })
    data = datasets.DatasetDict(new_dataset)

    train = data["train"]
    test = data["test"]
    for idx in range(len(train["image"])):
        train["image"][idx] = data_transform(train["image"][idx])
    for idx in range(len(test["image"])):
        test["image"][idx] = data_transform(test["image"][idx])
    train = Dataset.from_dict(train)
    train = train.with_format("torch")  # avoid list-typed columns
    test = Dataset.from_dict(test)
    test = test.with_format("torch")  # avoid list-typed columns

    train_loader = DataLoader(train, batch_size=BATCH_SIZE, shuffle=True, drop_last=True)
    test_loader = DataLoader(test, batch_size=BATCH_SIZE, shuffle=True, drop_last=True)
    return train_loader, test_loader
```
* then try this?
```
train_loader, test_loader = load_datasets()
```
| [
-0.504479169845581,
-0.29881200194358826,
0.07507500797510147,
0.28293177485466003,
-0.28362658619880676,
-0.07092409580945969,
-0.3885873258113861,
-0.18567277491092682,
0.38764631748199463,
0.21858438849449158,
-0.7066206932067871,
-0.5187947154045105,
-0.4248407483100891,
0.257956534624... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_stabilityai__StableBeluga2 | open-llm-leaderboard | 2023-10-15T10:41:15Z | 73 | 0 | null | [
"region:us"
] | 2023-10-15T10:41:15Z | 2023-10-15T10:41:07.000Z | 2023-10-15T10:41:07 | ---
pretty_name: Evaluation run of stabilityai/StableBeluga2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__StableBeluga2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T10:41:03.838240](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga2/blob/main/results_2023-10-15T10-41-03.838240.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4326761744966443,\n\
\ \"em_stderr\": 0.005073838660621812,\n \"f1\": 0.5027527265100691,\n\
\ \"f1_stderr\": 0.0048086605803724005,\n \"acc\": 0.5940617757706712,\n\
\ \"acc_stderr\": 0.01188966924347996\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4326761744966443,\n \"em_stderr\": 0.005073838660621812,\n\
\ \"f1\": 0.5027527265100691,\n \"f1_stderr\": 0.0048086605803724005\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35860500379075055,\n \
\ \"acc_stderr\": 0.013210317364134026\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825897\n\
\ }\n}\n```"
repo_url: https://huggingface.co/stabilityai/StableBeluga2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T10_41_03.838240
path:
- '**/details_harness|drop|3_2023-10-15T10-41-03.838240.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T10-41-03.838240.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T10_41_03.838240
path:
- '**/details_harness|gsm8k|5_2023-10-15T10-41-03.838240.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T10-41-03.838240.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T10_41_03.838240
path:
- '**/details_harness|winogrande|5_2023-10-15T10-41-03.838240.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T10-41-03.838240.parquet'
- config_name: results
data_files:
- split: 2023_10_15T10_41_03.838240
path:
- results_2023-10-15T10-41-03.838240.parquet
- split: latest
path:
- results_2023-10-15T10-41-03.838240.parquet
---
# Dataset Card for Evaluation run of stabilityai/StableBeluga2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/StableBeluga2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T10:41:03.838240](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga2/blob/main/results_2023-10-15T10-41-03.838240.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4326761744966443,
"em_stderr": 0.005073838660621812,
"f1": 0.5027527265100691,
"f1_stderr": 0.0048086605803724005,
"acc": 0.5940617757706712,
"acc_stderr": 0.01188966924347996
},
"harness|drop|3": {
"em": 0.4326761744966443,
"em_stderr": 0.005073838660621812,
"f1": 0.5027527265100691,
"f1_stderr": 0.0048086605803724005
},
"harness|gsm8k|5": {
"acc": 0.35860500379075055,
"acc_stderr": 0.013210317364134026
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825897
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.4139057695865631,
-0.6629728674888611,
0.10993623733520508,
0.3199988007545471,
-0.21332459151744843,
0.058675166219472885,
-0.40754714608192444,
-0.2503765821456909,
0.3712984025478363,
0.4637879729270935,
-0.6524335741996765,
-0.8575800657272339,
-0.7116678357124329,
0.074627168476581... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MajdTannous/Dataset2 | MajdTannous | 2023-10-21T08:29:39Z | 73 | 0 | squad | [
"task_categories:question-answering",
"task_ids:extractive-qa",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:extended|wikipedia",
"language:en",
"license:cc-by-4.0",
... | 2023-10-21T08:29:39Z | 2023-10-20T12:35:12.000Z | 2023-10-20T12:35:12 | ---
pretty_name: SQuAD
viewer: true
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: squad
train-eval-index:
- config: plain_text
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: validation
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
metrics:
- type: squad
name: SQuAD
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
config_name: plain_text
splits:
- name: train
num_bytes: 79317110
num_examples: 87599
- name: validation
num_bytes: 10472653
num_examples: 10570
download_size: 35142551
dataset_size: 89789763
---
# Dataset Card for "squad"
## Table of Contents
- [Dataset Card for "squad"](#dataset-card-for-squad)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [plain_text](#plain_text)
- [Data Fields](#data-fields)
- [plain_text](#plain_text-1)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://rajpurkar.github.io/SQuAD-explorer/](https://rajpurkar.github.io/SQuAD-explorer/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 35.14 MB
- **Size of the generated dataset:** 89.92 MB
- **Total amount of disk used:** 125.06 MB
### Dataset Summary
Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 35.14 MB
- **Size of the generated dataset:** 89.92 MB
- **Total amount of disk used:** 125.06 MB
An example of 'train' looks as follows.
```
{
"answers": {
"answer_start": [1],
"text": ["This is a test text"]
},
"context": "This is a test context.",
"id": "1",
"question": "Is this a test?",
"title": "train test"
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
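The instance shown earlier can be handled as plain Python structures; a minimal sketch (the values are the illustrative ones from the example above, not real data):

```python
example = {
    "id": "1",
    "title": "train test",
    "context": "This is a test context.",
    "question": "Is this a test?",
    "answers": {"text": ["This is a test text"], "answer_start": [1]},
}

# In real SQuAD data, answer_start gives the character offset of the
# answer inside context, so the span can be sliced back out.
start = example["answers"]["answer_start"][0]
answer = example["answers"]["text"][0]
print(f"Q: {example['question']} -> A: {answer} (char offset {start})")
```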
### Data Splits
| name |train|validation|
|----------|----:|---------:|
|plain_text|87599| 10570|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{2016arXiv160605250R,
author = {{Rajpurkar}, Pranav and {Zhang}, Jian and {Lopyrev},
Konstantin and {Liang}, Percy},
title = "{SQuAD: 100,000+ Questions for Machine Comprehension of Text}",
journal = {arXiv e-prints},
year = 2016,
eid = {arXiv:1606.05250},
pages = {arXiv:1606.05250},
archivePrefix = {arXiv},
eprint = {1606.05250},
}
``` | [
-0.6402162313461304,
-0.6525070071220398,
0.06994582712650299,
0.19016870856285095,
-0.11505743116140366,
0.12000777572393417,
-0.2764602303504944,
-0.38798782229423523,
0.5687065720558167,
0.3815827965736389,
-1.029456615447998,
-0.8777793645858765,
-0.39841917157173157,
0.233851239085197... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jxm/dbpedia | jxm | 2023-10-25T17:44:20Z | 73 | 0 | null | [
"region:us"
] | 2023-10-25T17:44:20Z | 2023-10-25T17:44:13.000Z | 2023-10-25T17:44:13 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 14782633
num_examples: 49999
- name: test
num_bytes: 20641120
num_examples: 70000
- name: dev
num_bytes: 74007
num_examples: 256
download_size: 21721890
dataset_size: 35497760
---
# Dataset Card for "dbpedia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7255640625953674,
-0.31716740131378174,
0.21683214604854584,
0.19700337946414948,
-0.15026983618736267,
-0.09515126049518585,
0.14091208577156067,
-0.23096847534179688,
0.9280035495758057,
0.4209236204624176,
-0.9911530613899231,
-0.7509352564811707,
-0.23996998369693756,
-0.18907488882... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nateraw/rap-lyrics-v1 | nateraw | 2023-11-06T09:07:02Z | 73 | 0 | null | [
"region:us"
] | 2023-11-06T09:07:02Z | 2023-11-06T09:07:00.000Z | 2023-11-06T09:07:00 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: artist
dtype: string
- name: title
dtype: string
- name: full_title
dtype: string
- name: lyrics
dtype: string
splits:
- name: train
num_bytes: 7948557
num_examples: 2350
download_size: 4158696
dataset_size: 7948557
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rap-lyrics-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5733618140220642,
-0.09021757543087006,
0.04197729006409645,
0.5484539270401001,
-0.2213486135005951,
0.15131942927837372,
0.2942140996456146,
-0.17428654432296753,
1.0599994659423828,
0.48666539788246155,
-1.0776069164276123,
-0.9282078146934509,
-0.8148462176322937,
-0.235111653804779... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
SalomonMetre13/nnd_fr_26k | SalomonMetre13 | 2023-11-20T09:08:05Z | 73 | 0 | null | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:nnd",
"license:mit",
"region:us"
] | 2023-11-20T09:08:05Z | 2023-11-12T15:40:13.000Z | 2023-11-12T15:40:13 | ---
license: mit
language:
- nnd
task_categories:
- translation
size_categories:
- 10K<n<100K
---
This <span style="color:teal;">parallel corpus</span> contains <span style="color:teal;">26,240</span> aligned <span style="color:teal;">Nande-French</span> sentence pairs in a <span style="color:teal;">90:10</span> train/test split. It has mainly been used to fine-tune the <span style="color:teal;">t5-base</span> pretrained model for the development of <a href="https://huggingface.co/SalomonMetre13/nnd_fr_mt_v3" style="color:green;">this translation model</a>. | [
-0.5006879568099976,
-0.6172949075698853,
0.41607093811035156,
0.747218132019043,
-0.3860440254211426,
0.26078352332115173,
-0.28283822536468506,
-0.2233787477016449,
0.3280188739299774,
0.01757332682609558,
-0.5903363823890686,
-0.5408064126968384,
-0.6344025731086731,
0.29705291986465454... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jhflow/share_gpt_translation_with_instruction | jhflow | 2023-11-13T10:08:14Z | 73 | 0 | null | [
"region:us"
] | 2023-11-13T10:08:14Z | 2023-11-13T10:03:25.000Z | 2023-11-13T10:03:25 | original dataset = [squarelike/sharegpt_deepl_ko_translation](https://huggingface.co/datasets/squarelike/sharegpt_deepl_ko_translation)
| [
-0.34005799889564514,
-0.6252004504203796,
0.19229553639888763,
0.6201967597007751,
-0.3760407865047455,
-0.2900170683860779,
-0.12330459803342819,
-0.05790678411722183,
0.7486727237701416,
0.7583656311035156,
-0.9654009938240051,
-0.545487642288208,
-0.6322172284126282,
0.0018299137009307... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
amaai-lab/MusicBench | amaai-lab | 2023-11-20T14:54:56Z | 73 | 7 | null | [
"license:cc-by-sa-3.0",
"arxiv:2311.08355",
"region:us"
] | 2023-11-20T14:54:56Z | 2023-11-15T03:07:56.000Z | 2023-11-15T03:07:56 |
---
license: cc-by-sa-3.0
---
# MusicBench Dataset
The MusicBench dataset is a music audio-text pair dataset designed for text-to-music generation and released along with the Mustango text-to-music model. MusicBench is based on the MusicCaps dataset, which it expands from 5,521 samples to 52,768 training and 400 test samples!
## Dataset Details
MusicBench expands MusicCaps by:
1. Including music features of chords, beats, tempo, and key that are extracted from the audio.
2. Describing these music features using text templates and thus enhancing the original text prompts.
3. Expanding the number of audio samples by performing musically meaningful augmentations: semitone pitch shifts, tempo changes, and volume changes.
Train set size = 52,768 samples
Test set size = 400 samples
### Dataset Description
MusicBench consists of 3 .json files and attached audio files in .tar.gz form.
The train set contains audio augmented samples and enhanced captions. Additionally, it offers ChatGPT rephrased captions for all the audio samples.
Both TestA and TestB sets contain the same audio content, but TestB has all 4 possible control sentences (related to 4 music features) in captions of all samples, while TestA has no control sentences in the captions.
For more details, see Figure 1 in our paper.
Each row of a .json file has:
1. **location** (of the files after decompressing the .tar.gz file)
2. **main_caption** – text prompts that are a result of augmentation (TestB contains control sentences, train set contains ChatGPT rephrased captions here)
3. **alt_caption** – in the case of TestB these are captions without any control sentences added.
4. prompt_aug – A control sentence related to volume change augmentation.
5. prompt_ch – A control sentence describing the chord sequence.
6. prompt_bt – A control sentence describing the beat count (meter)
7. prompt_bpm – A control sentence describing tempo, either in beats per minute (bpm), or in musical words, e.g., Adagio, Moderato, Presto.
8. prompt_key – A control sentence related to the extracted musical key.
9. **beats** – The beat and downbeat timestamps. This is used as an input for training Mustango.
10. bpm – The tempo feature saved as a number.
11. **chords** – The chord sequence contained in the track. This is used as an input for training Mustango.
12. **chords_time** – Timestamps of the detected chords. This is used as an input for training Mustango.
13. key – The root and the type of the detected key.
14. keyprob – The confidence score for this detected key provided by the detection algorithm.
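As an illustrative sketch of consuming these fields (the row below is invented for demonstration, and the files are assumed to parse as standard JSON objects; check the downloaded .json files for their exact layout):

```python
import json

# A hypothetical row carrying a subset of the fields listed above.
row_json = '''
{
  "location": "data/sample_0001.wav",
  "main_caption": "A calm piano piece in C major at a slow tempo.",
  "bpm": 72,
  "chords": ["C", "Am", "F", "G"],
  "chords_time": [0.0, 2.5, 5.0, 7.5],
  "key": "C major"
}
'''
row = json.loads(row_json)

# Pair each detected chord with its timestamp.
chord_events = list(zip(row["chords_time"], row["chords"]))
print(row["main_caption"])
print(chord_events)
```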
# FMACaps Evaluation Dataset
We also present the FMACaps evaluation dataset, which consists of 1,000 samples extracted from the Free Music Archive (FMA) and pseudo-captioned by extracting tags from the audio and then using ChatGPT in-context learning. More information is available in our paper!
Most of the samples are 10 seconds long; the exceptions are between 5 and 10 seconds long.
Data size: 1,000 samples
Sampling rate: 16 kHz
Files included:
1. 1,000 audio files in the "audiodata" folder
2. FMACaps_A – this file contains captions with NO control sentences.
3. FMACaps_B – this file contains captions with ALL control sentences. We used this file for the controllability evaluation of Mustango.
4. FMACaps_C – this file contains captions with SOME control sentences. For each sample, we chose 0/1/2/3/4 control sentences with a probability of 25/30/20/15/10 %, as described in our paper. This file was used to objectively evaluate the audio quality of Mustango.
The structure of each .json file is identical to MusicBench, as described in the previous section, except that the "alt_caption" column is empty. **All captions** are in the **"main_caption" column**!
## Links
- **Code Repository:** [https://github.com/AMAAI-Lab/mustango]
- **Paper:** [https://arxiv.org/abs/2311.08355]
- **Demo:** [https://replicate.com/declare-lab/mustango]
- **Website:** [https://amaai-lab.github.io/mustango/]
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```bibtex
@misc{melechovsky2023mustango,
title={Mustango: Toward Controllable Text-to-Music Generation},
author={Jan Melechovsky and Zixun Guo and Deepanway Ghosal and Navonil Majumder and Dorien Herremans and Soujanya Poria},
year={2023},
eprint={2311.08355},
archivePrefix={arXiv}
}
```
**License:** cc-by-sa-3.0 | [
-0.6722215414047241,
-0.4954882860183716,
0.14986051619052887,
0.48175376653671265,
-0.2539946734905243,
0.16901083290576935,
-0.4571700692176819,
-0.41095614433288574,
0.4006999731063843,
0.44528791308403015,
-0.972670316696167,
-0.6769918203353882,
-0.22017888724803925,
0.105385795235633... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
iambestfeed/preproced_baomoi | iambestfeed | 2023-11-16T10:08:38Z | 73 | 0 | null | [
"region:us"
] | 2023-11-16T10:08:38Z | 2023-11-16T09:38:18.000Z | 2023-11-16T09:38:18 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Santp98/sentences_triplets_secop2_splits | Santp98 | 2023-11-18T17:14:05Z | 73 | 0 | null | [
"region:us"
] | 2023-11-18T17:14:05Z | 2023-11-18T17:13:17.000Z | 2023-11-18T17:13:17 | ---
dataset_info:
features:
- name: segment_code_pos
dtype: string
- name: segment_code_neg
dtype: string
- name: anchor_sent
dtype: string
- name: positive_sent
dtype: string
- name: negative_sent
dtype: string
splits:
- name: train
num_bytes: 389514845.59367234
num_examples: 552087
- name: test
num_bytes: 83467920.46898298
num_examples: 118305
- name: validation
num_bytes: 83467214.93734469
num_examples: 118304
download_size: 313920558
dataset_size: 556449981.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nferruz/UR50_2021_04 | nferruz | 2022-07-22T13:44:04Z | 72 | 1 | null | [
"size_categories:unknown",
"region:us"
] | 2022-07-22T13:44:04Z | 2022-03-02T23:29:22.000Z | 2022-03-02T23:29:22 | ---
YAML tags:
annotations_creators: []
language_creators: []
language: []
license: []
multilinguality: []
pretty_name: ''
size_categories:
- unknown
source_datasets: []
task_categories: []
task_ids: []
---
# Dataset Card for UR50_2021_04
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
https://ftp.uniprot.org/pub/databases/uniprot/uniref/uniref50/
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.uniprot.org/
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The UniRef50 (UR50) dataset, version 2021/04, is a biological dataset taken from the UniProt database: https://www.uniprot.org/
### Supported Tasks and Leaderboards
The UR50 dataset contains 48 million protein sequences. It is a useful dataset for training protein language models.
### Languages
Proteins
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
Train, validation
## Dataset Creation
### Curation Rationale
FASTA headers were substituted with an <endoftext> tag.
The dataset was tokenized using BPE and further split into train and validation datasets (ratio 90/10) choosing random sequences for the latter.
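The two preprocessing steps described above (header substitution and the random 90/10 split) can be sketched as follows. This is an illustrative sketch only: the exact tag spelling, shuffling, and split mechanics used for this release are assumptions, and BPE tokenization is omitted.

```python
import random

def preprocess_fasta(lines, val_ratio=0.1, seed=0):
    """Replace each FASTA header with an <endoftext> tag and split the
    resulting sequences into train/validation sets (default ratio 90/10)."""
    seqs, cur = [], []
    for line in lines:
        if line.startswith(">"):          # a FASTA header marks a new record
            if cur:
                seqs.append("".join(cur))
            cur = []
        else:
            cur.append(line.strip())
    if cur:
        seqs.append("".join(cur))
    random.Random(seed).shuffle(seqs)     # random sequences go to validation
    n_val = int(len(seqs) * val_ratio)
    val = ["<endoftext>" + s for s in seqs[:n_val]]
    train = ["<endoftext>" + s for s in seqs[n_val:]]
    return train, val
```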
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
UniProt
### Annotations
#### Annotation process
UniProt contains annotations but no labels/annotations were used for this dataset.
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Citation Information
### Contributions
Thanks to UniProt for curating this dataset. https://www.uniprot.org/
| [
-0.3715476095676422,
-0.27054738998413086,
-0.04016074910759926,
0.3350169360637665,
-0.303460031747818,
0.15571565926074982,
-0.2187701016664505,
-0.24312585592269897,
0.3448774516582489,
0.45969244837760925,
-0.6956002116203308,
-0.9935675263404846,
-0.4621690809726715,
0.406290620565414... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lmqg/qg_itquad | lmqg | 2022-12-02T18:54:31Z | 72 | 1 | null | [
"task_categories:text-generation",
"task_ids:language-modeling",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:squad_es",
"language:it",
"license:cc-by-4.0",
"question-generation",
"arxiv:2210.03992",
"region:us"
] | 2022-12-02T18:54:31Z | 2022-06-02T23:45:12.000Z | 2022-06-02T23:45:12 | ---
license: cc-by-4.0
pretty_name: SQuAD-it for question generation
language: it
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets: squad_es
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qg_itquad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a subset of [QG-Bench](https://github.com/asahi417/lm-question-generation/blob/master/QG_BENCH.md#datasets), a unified question generation benchmark proposed in
["Generative Language Models for Paragraph-Level Question Generation: A Unified Benchmark and Evaluation, EMNLP 2022 main conference"](https://arxiv.org/abs/2210.03992).
This is a modified version of [SQuAD-it](https://huggingface.co/datasets/squad_it) for the question generation (QG) task.
Since the original dataset only contains training and validation sets, we manually sampled a test set from the training set;
it has no paragraph overlap with the remaining training set.
### Supported Tasks and Leaderboards
* `question-generation`: The dataset is assumed to be used to train a model for question generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more detail).
### Languages
Italian (it)
## Dataset Structure
An example of 'train' looks as follows.
```
{
'answer': 'Carlo III',
'question': "Il figlio di chi è morto sulla strada per Palermo e vi è sepolto?",
'sentence': 'Carlo III scelse Palermo per la sua incoronazione come Re di Sicilia.',
'paragraph': 'Dopo il trattato di Utrecht (1713), la Sicilia fu consegnata ai Savoia, ma nel 1734 fu nuovamente posseduta dai...',
'sentence_answer': '<hl> Carlo III <hl> scelse Palermo per la sua incoronazione come Re di Sicilia.',
'paragraph_answer': "Dopo il trattato di Utrecht (1713), la Sicilia fu consegnata ai Savoia, ma nel 1734 fu nuovamente posseduta dai borbonici. <hl> Carlo III <hl> scelse Palermo per la sua incoronazione come Re di Sicilia. Charles fece costruire nuove case per la popolazione in crescita, mentre il commercio e l' industria crebbero. Tuttavia, ormai Palermo era ora solo un' altra città provinciale, dato che la Corte Reale risiedeva a Napoli. Il figlio di Carlo Ferdinando, anche se non gradito dalla popolazione, si rifugiò a Palermo dopo la Rivoluzione francese del 1798. Suo figlio Alberto è morto sulla strada per Palermo ed è sepolto in città. Quando fu fondato il Regno delle Due Sicilie, la capitale originaria era Palermo (1816) ma un anno dopo si trasferì a Napoli.",
'paragraph_sentence': "Dopo il trattato di Utrecht (1713), la Sicilia fu consegnata ai Savoia, ma nel 1734 fu nuovamente posseduta dai borbonici. <hl> Carlo III scelse Palermo per la sua incoronazione come Re di Sicilia. <hl> Charles fece costruire nuove case per la popolazione in crescita, mentre il commercio e l' industria crebbero. Tuttavia, ormai Palermo era ora solo un' altra città provinciale, dato che la Corte Reale risiedeva a Napoli. Il figlio di Carlo Ferdinando, anche se non gradito dalla popolazione, si rifugiò a Palermo dopo la Rivoluzione francese del 1798. Suo figlio Alberto è morto sulla strada per Palermo ed è sepolto in città. Quando fu fondato il Regno delle Due Sicilie, la capitale originaria era Palermo (1816) ma un anno dopo si trasferì a Napoli."
}
```
The data fields are the same among all splits.
- `question`: a `string` feature.
- `paragraph`: a `string` feature.
- `answer`: a `string` feature.
- `sentence`: a `string` feature.
- `paragraph_answer`: a `string` feature, which is same as the paragraph but the answer is highlighted by a special token `<hl>`.
- `paragraph_sentence`: a `string` feature, which is same as the paragraph but a sentence containing the answer is highlighted by a special token `<hl>`.
- `sentence_answer`: a `string` feature, which is same as the sentence but the answer is highlighted by a special token `<hl>`.
Each of the `paragraph_answer`, `paragraph_sentence`, and `sentence_answer` features is assumed to be used to train a question generation model,
but with different information. The `paragraph_answer` and `sentence_answer` features are for answer-aware question generation, and the
`paragraph_sentence` feature is for sentence-aware question generation.
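As a rough illustration, the three highlighted features can be derived from the plain `paragraph`, `sentence`, and `answer` fields by wrapping the relevant span in `<hl>` tokens. This is a sketch of the relationship between the fields, not the dataset's actual construction code:

```python
def highlight(text, span, tok="<hl>"):
    """Wrap the first occurrence of `span` in `text` with highlight tokens."""
    i = text.index(span)
    return f"{text[:i]}{tok} {span} {tok}{text[i + len(span):]}"

paragraph = "Carlo III scelse Palermo. Charles fece costruire nuove case."
sentence = "Carlo III scelse Palermo."
answer = "Carlo III"

sentence_answer = highlight(sentence, answer)        # answer-aware, sentence level
paragraph_answer = highlight(paragraph, answer)      # answer-aware, paragraph level
paragraph_sentence = highlight(paragraph, sentence)  # sentence-aware
```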
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|46550| 7609 |7609|
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` | [
-0.5552982687950134,
-0.8225770592689514,
0.4288827180862427,
0.2951846122741699,
-0.152280792593956,
-0.21321450173854828,
-0.1258765608072281,
-0.12410841137170792,
0.17017482221126556,
0.43111059069633484,
-0.7463076710700989,
-0.5223461985588074,
-0.002470244886353612,
0.19393286108970... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
PiC/phrase_retrieval | PiC | 2023-01-20T16:32:55Z | 72 | 5 | phrase-in-context | [
"task_categories:text-retrieval",
"annotations_creators:expert-generated",
"language_creators:found",
"language_creators:expert-generated",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:original",
"language:en",
"license:cc-by-nc-4.0",
"region:us"
] | 2023-01-20T16:32:55Z | 2022-06-13T20:58:56.000Z | 2022-06-13T20:58:56 | ---
annotations_creators:
- expert-generated
language_creators:
- found
- expert-generated
language:
- en
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
paperswithcode_id: phrase-in-context
pretty_name: 'PiC: Phrase Retrieval'
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-retrieval
task_ids: []
---
# Dataset Card for "PiC: Phrase Retrieval"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://phrase-in-context.github.io/](https://phrase-in-context.github.io/)
- **Repository:** [https://github.com/phrase-in-context](https://github.com/phrase-in-context)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [Thang Pham](<thangpham@auburn.edu>)
### Dataset Summary
PR is a phrase retrieval task with the goal of finding a phrase **t** in a given document **d** such that **t** is semantically similar to the query phrase, which is the paraphrase **q**<sub>1</sub> provided by annotators.
We release two versions of PR: **PR-pass** and **PR-page**, i.e., datasets of 3-tuples (query **q**<sub>1</sub>, target phrase **t**, document **d**) where **d** is either a random 11-sentence passage that contains **t** (PR-pass) or an entire Wikipedia page (PR-page).
While PR-pass contains 28,147 examples, PR-page contains slightly fewer examples (28,098) as we remove those trivial examples whose Wikipedia pages contain exactly the query phrase (in addition to the target phrase).
Both datasets are split into 5K/3K/~20K for test/dev/train, respectively.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English.
## Dataset Structure
### Data Instances
**PR-pass**
* Size of downloaded dataset files: 43.61 MB
* Size of the generated dataset: 36.98 MB
* Total amount of disk used: 80.59 MB
An example of 'train' looks as follows.
```
{
"id": "3478-1",
"title": "https://en.wikipedia.org/wiki?curid=181261",
"context": "The 425t was a 'pizza box' design with a single network expansion slot. The 433s was a desk-side server systems with multiple expansion slots. Compatibility. PC compatibility was possible either through software emulation, using the optional product DPCE, or through a plug-in card carrying an Intel 80286 processor. A third-party plug-in card with a 386 was also available. An Apollo Token Ring network card could also be placed in a standard PC and network drivers allowed it to connect to a server running a PC SMB (Server Message Block) file server. Usage. Although Apollo systems were easy to use and administer, they became less cost-effective because the proprietary operating system made software more expensive than Unix software. The 68K processors were slower than the new RISC chips from Sun and Hewlett-Packard. Apollo addressed both problems by introducing the RISC-based DN10000 and Unix-friendly Domain/OS operating system. However, the DN10000, though fast, was extremely expensive, and a reliable version of Domain/OS came too late to make a difference.",
"query": "dependable adaptation",
"answers": {
"text": ["reliable version"],
"answer_start": [1006]
}
}
```
**PR-page**
* Size of downloaded dataset files: 421.56 MB
* Size of the generated dataset: 412.17 MB
* Total amount of disk used: 833.73 MB
An example of 'train' looks as follows.
```
{
"id": "5961-2",
"title": "https://en.wikipedia.org/wiki?curid=354711",
"context": "Joseph Locke FRSA (9 August 1805 – 18 September 1860) was a notable English civil engineer of the nineteenth century, particularly associated with railway projects. Locke ranked alongside Robert Stephenson and Isambard Kingdom Brunel as one of the major pioneers of railway development. Early life and career. Locke was born in Attercliffe, Sheffield in Yorkshire, moving to nearby Barnsley when he was five. By the age of 17, Joseph had already served an apprenticeship under William Stobart at Pelaw, on the south bank of the Tyne, and under his own father, William. He was an experienced mining engineer, able to survey, sink shafts, to construct railways, tunnels and stationary engines. Joseph's father had been a manager at Wallbottle colliery on Tyneside when George Stephenson was a fireman there. In 1823, when Joseph was 17, Stephenson was involved with planning the Stockton and Darlington Railway. He and his son Robert Stephenson visited William Locke and his son at Barnsley and it was arranged that Joseph would go to work for the Stephensons. The Stephensons established a locomotive works near Forth Street, Newcastle upon Tyne, to manufacture locomotives for the new railway. Joseph Locke, despite his youth, soon established a position of authority. He and Robert Stephenson became close friends, but their friendship was interrupted, in 1824, by Robert leaving to work in Colombia for three years. Liverpool and Manchester Railway. George Stephenson carried out the original survey of the line of the Liverpool and Manchester Railway, but this was found to be flawed, and the line was re-surveyed by a talented young engineer, Charles Vignoles. Joseph Locke was asked by the directors to carry out another survey of the proposed tunnel works and produce a report. The report was highly critical of the work already done, which reflected badly on Stephenson. 
Stephenson was furious and henceforth relations between the two men were strained, although Locke continued to be employed by Stephenson, probably because the latter recognised his worth. Despite the many criticisms of Stephenson's work, when the bill for the new line was finally passed, in 1826, Stephenson was appointed as engineer and he appointed Joseph Locke as his assistant to work alongside Vignoles, who was the other assistant. However, a clash of personalities between Stephenson and Vignoles led to the latter resigning, leaving Locke as the sole assistant engineer. Locke took over responsibility for the western half of the line. One of the major obstacles to be overcome was Chat Moss, a large bog that had to be crossed. Although, Stephenson usually gets the credit for this feat, it is believed that it was Locke who suggested the correct method for crossing the bog. Whilst the line was being built, the directors were trying to decide whether to use standing engines or locomotives to propel the trains. Robert Stephenson and Joseph Locke were convinced that locomotives were vastly superior, and in March 1829 the two men wrote a report demonstrating the superiority of locomotives when used on a busy railway. The report led to the decision by the directors to hold an open trial to find the best locomotive. This was the Rainhill Trials, which were run in October 1829, and were won by \"Rocket\". When the line was finally opened in 1830, it was planned for a procession of eight trains to travel from Liverpool to Manchester and back. George Stephenson drove the leading locomotive \"Northumbrian\" and Joseph Locke drove \"Rocket\". The day was marred by the death of William Huskisson, the Member of Parliament for Liverpool, who was struck and killed by \"Rocket\". Grand Junction Railway. In 1829 Locke was George Stephenson's assistant, given the job of surveying the route for the Grand Junction Railway. 
This new railway was to join Newton-le-Willows on the Liverpool and Manchester Railway with Warrington and then on to Birmingham via Crewe, Stafford and Wolverhampton, a total of 80 miles. Locke is credited with choosing the location for Crewe and recommending the establishment there of shops required for the building and repairs of carriages and wagons as well as engines. During the construction of the Liverpool and Manchester Railway, Stephenson had shown a lack of ability in organising major civil engineering projects. On the other hand, Locke's ability to manage complex projects was well known. The directors of the new railway decided on a compromise whereby Locke was made responsible for the northern half of the line and Stephenson was made responsible for the southern half. However Stephenson's administrative inefficiency soon became apparent, whereas Locke estimated the costs for his section of the line so meticulously and speedily, that he had all of the contracts signed for his section of the line before a single one had been signed for Stephenson's section. The railway company lost patience with Stephenson, but tried to compromise by making both men joint-engineers. Stephenson's pride would not let him accept this, and so he resigned from the project. By autumn of 1835 Locke had become chief engineer for the whole of the line. This caused a rift between the two men, and strained relations between Locke and Robert Stephenson. Up to this point, Locke had always been under George Stephenson's shadow. From then on, he would be his own man, and stand or fall by his own achievements. The line was opened on 4 July 1837. New methods. Locke's route avoided as far as possible major civil engineering works. The main one was the Dutton Viaduct which crosses the River Weaver and the Weaver Navigation between the villages of Dutton and Acton Bridge in Cheshire. The viaduct consists of 20 arches with spans of 20 yards. 
An important feature of the new railway was the use of double-headed (dumb-bell) wrought-iron rail supported on timber sleepers at 2 ft 6 in intervals. It was intended that when the rails became worn they could be turned over to use the other surface, but in practice it was found that the chairs into which the rails were keyed caused wear to the bottom surface so that it became uneven. However this was still an improvement on the fish-bellied, wrought-iron rails still being used by Robert Stephenson on the London and Birmingham Railway. Locke was more careful than Stephenson to get value for his employers' money. For the Penkridge Viaduct Stephenson had obtained a tender of £26,000. After Locke took over, he gave the potential contractor better information and agreed a price of only £6,000. Locke also tried to avoid tunnels because in those days tunnels often took longer and cost more than planned. The Stephensons regarded 1 in 330 as the maximum slope that an engine could manage and Robert Stephenson achieved this on the London and Birmingham Railway by using seven tunnels which added both cost and delay. Locke avoided tunnels almost completely on the Grand Junction but exceeded the slope limit for six miles south of Crewe. Proof of Locke's ability to estimate costs accurately is given by the fact that the construction of the Grand Junction line cost £18,846 per mile as against Locke's estimate of £17,000. This is amazingly accurate compared with the estimated costs for the London and Birmingham Railway (Robert Stephenson) and the Great Western Railway (Brunel). Locke also divided the project into a few large sections rather than many small ones. This allowed him to work closely with his contractors to develop the best methods, overcome problems and personally gain practical experience of the building process and of the contractors themselves. He used the contractors who worked well with him, especially Thomas Brassey and William Mackenzie, on many other projects. 
Everyone gained from this cooperative approach whereas Brunel's more adversarial approach eventually made it hard for him to get anyone to work for him. Marriage. In 1834 Locke married Phoebe McCreery, with whom he adopted a child. He was elected to the Royal Society in 1838. Lancaster and Carlisle Railway. A significant difference in philosophy between George Stephenson and Joseph Locke and the surveying methods they employed was more than a mere difference of opinion. Stephenson had started his career at a time when locomotives had little power to overcome excessive gradients. Both George and Robert Stephenson were prepared to go to great lengths to avoid steep gradients that would tax the locomotives of the day, even if this meant choosing a circuitous path that added on extra miles to the line of the route. Locke had more confidence in the ability of modern locomotives to climb these gradients. An example of this was the Lancaster and Carlisle Railway, which had to cope with the barrier of the Lake District mountains. In 1839 Stephenson proposed a circuitous route that avoided the Lake District altogether by going all the way round Morecambe Bay and West Cumberland, claiming: 'This is the only practicable line from Liverpool to Carlisle. The making of a railway across Shap Fell is out of the question.' The directors rejected his route and chose the one proposed by Joseph Locke, one that used steep gradients and passed over Shap Fell. The line was completed by Locke and was a success. Locke's reasoned that by avoiding long routes and tunnelling, the line could be finished more quickly, with less capital costs, and could start earning revenue sooner. This became known as the 'up and over' school of engineering (referred to by Rolt as 'Up and Down,' or Rollercoaster). Locke took a similar approach in planning the Caledonian Railway, from Carlisle to Glasgow. 
In both railways he introduced gradients of 1 in 75, which severely taxed fully laden locomotives, for even as more powerful locomotives were introduced, the trains that they pulled became heavier. It may therefore be argued that Locke, although his philosophy carried the day, was not entirely correct in his reasoning. Even today, Shap Fell is a severe test of any locomotive. Manchester and Sheffield Railway. Locke was subsequently appointed to build a railway line from Manchester to Sheffield, replacing Charles Vignoles as chief engineer, after the latter had been beset by misfortunes and financial difficulties. The project included the three-mile Woodhead Tunnel, and the line opened, after many delays, on 23 December 1845. The building of the line required over a thousand navvies and cost the lives of thirty-two of them, seriously injuring 140 others. The Woodhead Tunnel was such a difficult undertaking that George Stephenson claimed that it could not be done, declaring that he would eat the first locomotive that got through the tunnel. Subsequent commissions. In the north, Locke also designed the Lancaster and Preston Junction Railway; the Glasgow, Paisley and Greenock Railway; and the Caledonian Railway from Carlisle to Glasgow and Edinburgh. In the south, he worked on the London and Southampton Railway, later called the London and South Western Railway, designing, among other structures, Nine Elms to Waterloo Viaduct, Richmond Railway Bridge (1848, since replaced), and Barnes Bridge (1849), both across the River Thames, tunnels at Micheldever, and the 12-arch Quay Street viaduct and the 16-arch Cams Hill viaduct, both in Fareham (1848). He was actively involved in planning and building many railways in Europe (assisted by John Milroy), including the Le Havre, Rouen, Paris rail link, the Barcelona to Mataró line and the Dutch Rhenish Railway. 
He was present in Paris when the Versailles train crash occurred in 1842, and produced a statement concerning the facts for General Charles Pasley of the Railway Inspectorate. He also experienced a catastrophic failure of one of his viaducts built on the new Paris-Le Havre link. . The viaduct was of stone and brick at Barentin near Rouen, and was the longest and highest on the line. It was 108 feet high, and consisted of 27 arches, each 50 feet wide, with a total length of over 1600 feet. A boy hauling ballast for the line up an adjoining hillside early that morning (about 6.00 am) saw one arch (the fifth on the Rouen side) collapse, and the rest followed suit. Fortunately, no one was killed, although several workmen were injured in a mill below the structure. Locke attributed the catastrophic failure to frost action on the new lime cement, and premature off-centre loading of the viaduct with ballast. It was rebuilt at Thomas Brassey's cost, and survives to the present. Having pioneered many new lines in France, Locke also helped establish the first locomotive works in the country. Distinctive features of Locke's railway works were economy, the use of masonry bridges wherever possible and the absence of tunnels. An illustration of this is that there is no tunnel between Birmingham and Glasgow. Relationship with Robert Stephenson. Locke and Robert Stephenson had been good friends at the beginning of their careers, but their friendship had been marred by Locke's falling out with Robert's father. It seems that Robert felt loyalty to his father required that he should take his side. It is significant that after the death of George Stephenson in August 1848, the friendship of the two men was revived. When Robert Stephenson died in October 1859, Joseph Locke was a pallbearer at his funeral. Locke is reported to have referred to Robert as 'the friend of my youth, the companion of my ripening years, and a competitor in the race of life'. 
Locke was also on friendly terms with his other engineering rival, Isambard Kingdom Brunel. In 1845, Locke and Stephenson were both called to give evidence before two committees. In April a House of Commons Select Committee was investigating the atmospheric railway system proposed by Brunel. Brunel and Vignoles spoke in support of the system, whilst Locke and Stephenson spoke against it. The latter two were to be proved right in the long run. In August the two gave evidence before the Gauge Commissioners who were trying to arrive at a standard gauge for the whole country. Brunel spoke in favour of the 7 ft gauge he was using on the Great Western Railway. Locke and Stephenson spoke in favour of the 4 ft 8½in gauge that they had used on several lines. The latter two won the day and their gauge was adopted as the standard. Later life and legacy. Locke served as President of the Institution of Civil Engineers in between December 1857 and December 1859. He also served as Member of Parliament for Honiton in Devon from 1847 until his death. Joseph Locke died on 18 September 1860, apparently from appendicitis, whilst on a shooting holiday. He is buried in London's Kensal Green Cemetery. He outlived his friends/rivals Robert Stephenson and Isambard Brunel by less than a year; all three engineers died between 53 and 56 years of age, a circumstance attributed by Rolt to sheer overwork, accomplishing more in their brief lives than many achieve in a full three score and ten. Locke Park in Barnsley was dedicated to his memory by his widow Phoebe in 1862. It features a statue of Locke plus a folly, 'Locke Tower'. Locke's greatest legacy is the modern day West Coast Main Line (WCML), which was formed by the joining of the Caledonian, Lancaster & Carlisle, Grand Junction railways to Robert Stephenson's London & Birmingham Railway. As a result, around three-quarters of the WCML's route was planned and engineered by Locke.",
"query": "accurate approach",
"answers": {
"text": ["correct method"],
"answer_start": [2727]
}
}
```
### Data Fields
The data fields are the same among all subsets and splits.
* id: a string feature.
* title: a string feature.
* context: a string feature.
* query: a string feature.
* answers: a dictionary feature containing:
* text: a list of string features.
* answer_start: a list of int32 features.
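A quick way to read these fields together: each `answer_start` offset should point at the corresponding answer `text` inside `context`. The record below is made up for illustration (it is not taken from the dataset) and sketches that consistency check:

```python
def check_answers(example):
    """Verify each answer_start offset locates its answer text in context."""
    ctx = example["context"]
    for text, start in zip(example["answers"]["text"],
                           example["answers"]["answer_start"]):
        assert ctx[start:start + len(text)] == text
    return True

example = {
    "id": "0-0",
    "title": "https://en.wikipedia.org/wiki?curid=354711",
    "context": "Locke suggested the correct method for crossing the bog.",
    "query": "accurate approach",
    "answers": {"text": ["correct method"], "answer_start": [20]},
}
```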
### Data Splits
| name |train|validation|test|
|--------------------|----:|---------:|---:|
|PR-pass |20147| 3000|5000|
|PR-page |20098| 3000|5000|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The source passages and answers are from Wikipedia, and the queries were produced by linguistic experts hired via [Upwork.com](https://upwork.com).
#### Who are the source language producers?
We hired 13 linguistic experts from [Upwork.com](https://upwork.com) for annotation, plus more than 1,000 human annotators on Mechanical Turk and another set of 5 Upwork experts for two-round verification.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
13 linguistic experts from [Upwork.com](https://upwork.com).
### Personal and Sensitive Information
No annotator identifying details are provided.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset is a joint work between Adobe Research and Auburn University.
Creators: [Thang M. Pham](https://scholar.google.com/citations?user=eNrX3mYAAAAJ), [David Seunghyun Yoon](https://david-yoon.github.io/), [Trung Bui](https://sites.google.com/site/trungbuistanford/), and [Anh Nguyen](https://anhnguyen.me).
[@PMThangXAI](https://twitter.com/pmthangxai) added this dataset to HuggingFace.
### Licensing Information
This dataset is distributed under [Creative Commons Attribution-NonCommercial 4.0 International (CC-BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/)
### Citation Information
```
@article{pham2022PiC,
title={PiC: A Phrase-in-Context Dataset for Phrase Understanding and Semantic Search},
author={Pham, Thang M and Yoon, Seunghyun and Bui, Trung and Nguyen, Anh},
journal={arXiv preprint arXiv:2207.09068},
year={2022}
}
``` | [
-0.4005766808986664,
-0.4854542315006256,
0.6253951787948608,
0.29481253027915955,
-0.149898499250412,
-0.18190473318099976,
-0.01104743406176567,
-0.26299354434013367,
0.493338406085968,
0.3481229841709137,
-0.4388238191604614,
-0.24235370755195618,
-0.4122815728187561,
-0.024132082238793... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
arpelarpe/nota | arpelarpe | 2022-10-11T07:56:49Z | 72 | 2 | null | [
"task_categories:automatic-speech-recognition",
"multilinguality:monolingual",
"language:da",
"license:cc0-1.0",
"region:us"
] | 2022-10-11T07:56:49Z | 2022-10-11T06:37:42.000Z | 2022-10-11T06:37:42 | ---
pretty_name: Nota
license:
- cc0-1.0
language:
- da
multilinguality:
- monolingual
task_categories:
- automatic-speech-recognition
---
# Dataset Card Nota Lyd- og tekstdata
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Disclaimer](#disclaimer)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Homepage:** https://sprogteknologi.dk/dataset/notalyd-ogtekstdata
- **Data Storage Url:** https://sprogtek-ressources.digst.govcloud.dk/nota/
- **Point of Contact:** info@sprogteknologi.dk
### Dataset Summary
This data was created by the public institution Nota (https://nota.dk/), which is part of the Danish Ministry of Culture. Nota maintains a library of audiobooks and audiomagazines for people with reading or sight disabilities, and also produces a number of audiobooks and audiomagazines itself.
The dataset consists of .wav and .txt files from Nota's audiomagazines "Inspiration" and "Radio/TV".
The dataset has been published as a part of the initiative sprogteknologi.dk, within the Danish Agency for Digital Government (www.digst.dk).
336 GB of available data, containing voice recordings and accompanying transcripts.
Each publication has been segmented into 2–50 second .wav files, each with an accompanying transcription.
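Since each clip is stated to fall in the 2–50 second range, a simple duration sanity check can be written with the standard-library `wave` module. This is an illustrative sketch, not part of the dataset tooling; the in-memory clip below stands in for a downloaded file:

```python
import io
import wave

def clip_duration_seconds(wav_bytes: bytes) -> float:
    """Return the duration of a WAV clip in seconds."""
    with wave.open(io.BytesIO(wav_bytes), "rb") as wav:
        return wav.getnframes() / wav.getframerate()

# Build a 3-second mono 44.1 kHz clip of silence in memory for demonstration.
buffer = io.BytesIO()
with wave.open(buffer, "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)            # 16-bit samples
    wav.setframerate(44100)
    wav.writeframes(b"\x00\x00" * 44100 * 3)

duration = clip_duration_seconds(buffer.getvalue())
assert 2.0 <= duration <= 50.0     # matches the stated segment range
```

The same check could be run over real clips by reading each file's bytes instead of the synthetic buffer.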
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
Danish
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, called path and its sentence.
```
{'path': '<path_to_clip>.wav', 'sentence': 'Dette er et eksempel', 'audio': {'path': '<path_to_clip>.wav', 'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32), 'sampling_rate': 44100}}
```
### Data Fields
path: The path to the audio file
audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
sentence: The sentence that was read by the speaker
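The performance point above — querying the sample index before the `"audio"` column — can be illustrated with a toy lazy-decoding dataset. This is a pure-Python sketch that only mimics the access pattern, not the actual `datasets` implementation:

```python
class LazyAudioDataset:
    """Toy stand-in for a dataset whose audio column is decoded on access."""

    def __init__(self, paths):
        self.paths = paths
        self.decodes = 0  # counts how many clips have been decoded

    def _decode(self, path):
        self.decodes += 1
        return {"path": path, "array": [0.0], "sampling_rate": 44100}

    def __getitem__(self, key):
        if isinstance(key, int):  # dataset[0] -> one row, one decode
            return {"path": self.paths[key],
                    "audio": self._decode(self.paths[key])}
        # dataset["audio"] -> materializes (decodes) the whole column
        return [self._decode(p) for p in self.paths]

ds = LazyAudioDataset([f"clip_{i}.wav" for i in range(1000)])

row = ds[0]["audio"]        # decodes exactly one clip
assert ds.decodes == 1

column = ds["audio"][0]     # decodes all 1000 clips before indexing
assert ds.decodes == 1001
```

The second access pattern pays for every clip in the column, which is why indexing the row first is preferred.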
### Data Splits
For now, the material has only a train split. As the dataset is at a very early stage, additional splits might be introduced later.
## Dataset Creation
### Disclaimer
There may be minor discrepancies between the .wav and .txt files, so timestamps, text, and sound files may not always align exactly.
There are no strict rules for how readers pronounce non-letter characters (e.g. numbers, €, $, !, ?), so these symbols can be read differently throughout the dataset.
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset is made public and free to use. Recorded individuals have, by written contract, accepted and agreed to the publication of their recordings.
Other names appearing in the dataset belong to already publicly known individuals (e.g. TV or radio hosts). Their names are not to be treated as sensitive or personal data in the context of this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
https://sprogteknologi.dk/
Contact info@sprogteknologi.dk if you have questions regarding use of data.
They gladly receive inputs and ideas on how to distribute the data.
### Licensing Information
[CC0-1.0](https://creativecommons.org/publicdomain/zero/1.0/)
### | [
-0.4831155836582184,
-0.5330892205238342,
0.19893373548984528,
0.30725449323654175,
-0.2375423163175583,
-0.027923505753278732,
-0.4297415614128113,
-0.47156575322151184,
0.5064242482185364,
0.6113149523735046,
-0.879410445690155,
-1.0206207036972046,
-0.6294000148773193,
0.273351341485977... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
suolyer/webqa | suolyer | 2023-02-23T02:12:12Z | 72 | 16 | null | [
"license:apache-2.0",
"region:us"
] | 2023-02-23T02:12:12Z | 2023-02-22T11:17:52.000Z | 2023-02-22T11:17:52 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bigbio/bronco | bigbio | 2023-04-01T16:47:31Z | 72 | 2 | null | [
"multilinguality:monolingual",
"language:de",
"region:us"
] | 2023-04-01T16:47:31Z | 2023-04-01T16:46:42.000Z | 2023-04-01T16:46:42 | ---
language:
- de
bigbio_language:
- German
multilinguality: monolingual
pretty_name: BRONCO150
homepage: https://www2.informatik.hu-berlin.de/~leser/bronco/index.html
bigbio_pubmed: false
bigbio_public: false
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- NAMED_ENTITY_DISAMBIGUATION
---
# Dataset Card for BRONCO150
## Dataset Description
- **Homepage:** https://www2.informatik.hu-berlin.de/~leser/bronco/index.html
- **Pubmed:** False
- **Public:** False
- **Tasks:** NER, NED
BRONCO150 is a corpus containing selected sentences of 150 German discharge summaries of cancer patients (hepatocellular carcinoma or melanoma) treated at Charite Universitaetsmedizin Berlin or Universitaetsklinikum Tuebingen. All discharge summaries were manually anonymized. The original documents were scrambled at the sentence level to make reconstruction of individual reports impossible.
## Citation Information
```
@article{10.1093/jamiaopen/ooab025,
author = {Kittner, Madeleine and Lamping, Mario and Rieke, Damian T and Götze, Julian and Bajwa, Bariya and Jelas, Ivan and Rüter, Gina and Hautow, Hanjo and Sänger, Mario and Habibi, Maryam and Zettwitz, Marit and Bortoli, Till de and Ostermann, Leonie and Ševa, Jurica and Starlinger, Johannes and Kohlbacher, Oliver and Malek, Nisar P and Keilholz, Ulrich and Leser, Ulf},
title = "{Annotation and initial evaluation of a large annotated German oncological corpus}",
journal = {JAMIA Open},
volume = {4},
number = {2},
year = {2021},
month = {04},
issn = {2574-2531},
doi = {10.1093/jamiaopen/ooab025},
url = {https://doi.org/10.1093/jamiaopen/ooab025},
note = {ooab025},
eprint = {https://academic.oup.com/jamiaopen/article-pdf/4/2/ooab025/38830128/ooab025.pdf},
}
```
| [
-0.3123389184474945,
-0.3710043132305145,
0.318958044052124,
0.4927828013896942,
-0.15123867988586426,
-0.20563480257987976,
0.058626558631658554,
-0.17648060619831085,
0.4039563238620758,
0.8175966143608093,
-0.5597309470176697,
-1.0644419193267822,
-0.5496386289596558,
0.5090396404266357... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nicholasKluge/instruct-aira-dataset | nicholasKluge | 2023-11-10T12:53:27Z | 72 | 2 | null | [
"task_categories:conversational",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:pt",
"language:en",
"language:es",
"license:apache-2.0",
"alignment",
"instruction",
"chat",
"region:us"
] | 2023-11-10T12:53:27Z | 2023-06-07T17:09:55.000Z | 2023-06-07T17:09:55 | ---
license: apache-2.0
task_categories:
- conversational
- text-generation
language:
- pt
- en
- es
tags:
- alignment
- instruction
- chat
pretty_name: Instruct-Aira Dataset
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: portuguese
num_bytes: 53113297
num_examples: 41815
- name: english
num_bytes: 47263211
num_examples: 41815
- name: spanish
num_bytes: 54272293
num_examples: 41815
download_size: 86279324
dataset_size: 154648801
---
# Dataset (`Instruct-Aira Dataset`)
### Overview
This dataset contains a collection of demonstrations of how to answer questions and follow instructions. We used prompts from the [`synthetic-instruct-gptj-pairwise`](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise) dataset, the [`databricks_dolly_15k`](https://huggingface.co/datasets/HuggingFaceH4/databricks_dolly_15k) dataset, and the [`instruction-dataset`](https://huggingface.co/datasets/HuggingFaceH4/instruction-dataset) dataset to create an instruction-tuning dataset whose completions were generated by already-tuned models (ChatGPT, Llama 2, Open-Assistant, etc.). The dataset is available in Portuguese, English, and Spanish.
### Dataset Details
- **Dataset Name:** Instruct-Aira Dataset
- **Language:** Portuguese, English, Spanish
- **Total Size:** Over 41,000 demonstrations per language
### Contents
The dataset consists of data frames with the following columns:
- **Prompt:** The initial text or question provided to the model.
- **Completion:** The demonstration of a generated completion or response for the given prompt.
```python
{
"prompt":"What is the capital of Brazil?",
"completion": "The capital of Brazil is Brasília."
}
```
All `prompt + completion` examples are less than 400 tokens (measured using the `GPT-2` and `BLOOM` tokenizers).
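The stated 400-token budget was measured with the GPT-2 and BLOOM subword tokenizers (via `transformers`). As a rough, dependency-free approximation, a length filter can be sketched with whitespace tokens — note that whitespace splitting undercounts relative to a real subword tokenizer, so this is only illustrative:

```python
def approx_token_count(text: str) -> int:
    """Rough stand-in for a subword tokenizer: counts whitespace tokens."""
    return len(text.split())

def within_budget(example: dict, max_tokens: int = 400) -> bool:
    """Check that prompt + completion stay under an approximate token budget."""
    combined = example["prompt"] + " " + example["completion"]
    return approx_token_count(combined) < max_tokens

sample = {"prompt": "What is the capital of Brazil?",
          "completion": "The capital of Brazil is Brasília."}
assert within_budget(sample)
```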
### Use Cases
`Instruct-Aira Dataset` can be utilized for various natural language processing tasks, including but not limited to:
- Language generation.
- Question-answering systems.
- Chatbot development.
- Evaluation of language models.
- AI ethics research.
- Alignment research.
## How to use
Available splits are `english`, `portuguese`, and `spanish`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/instruct-aira-dataset")
```
### Disclaimer
This dataset is provided as is, without any warranty or guarantee of its accuracy or suitability for any purpose. The creators and contributors of this dataset are not liable for any damages or losses arising from its use. Please review and comply with the licenses and terms of the original datasets before use. | [
-0.32158467173576355,
-1.0433831214904785,
0.19401265680789948,
0.3277653157711029,
-0.024715038016438484,
-0.09198101609945297,
-0.15113484859466553,
0.025417190045118332,
0.21901430189609528,
0.5277559757232666,
-0.6667211651802063,
-0.38057151436805725,
-0.4180154502391815,
0.0514186918... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
v-xchen-v/agieval_eng_qa | v-xchen-v | 2023-07-16T11:04:52Z | 72 | 1 | null | [
"license:mit",
"region:us"
] | 2023-07-16T11:04:52Z | 2023-07-10T06:03:23.000Z | 2023-07-10T06:03:23 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
talby/spamassassin | talby | 2023-07-11T18:36:22Z | 72 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-07-11T18:36:22Z | 2023-07-10T17:59:18.000Z | 2023-07-10T17:59:18 | ---
license: unknown
---
# Dataset Card for the SpamAssassin public mail corpus
## Dataset Description
- **Homepage:** https://spamassassin.apache.org/old/publiccorpus/readme.html
### Dataset Summary
This is a selection of mail messages assembled by members of the SpamAssassin project, suitable for use in testing spam-filtering systems.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
- The `text` config normalizes all character sets to utf8 and dumps the
MIME tree as a JSON list of lists.
- The `unprocessed` config does not parse messages at all, leaving the
full headers and content as binary.
### Data Fields
- `label`: `spam` or `ham`
- `group`: SpamAssassin has grouped these samples into categories
{'hard_ham', 'spam_2', 'spam', 'easy_ham', 'easy_ham_2'}
- `text`: normalized text of the message bodies
- `raw`: full binary headers and contents of messages
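The correspondence between `group` and `label` can be expressed as a small helper. This mapping is an assumption inferred from the group names listed above, not code shipped with the dataset:

```python
def label_for_group(group: str) -> str:
    """Map a SpamAssassin corpus group name to its binary label (assumed rule)."""
    return "spam" if group.startswith("spam") else "ham"

groups = ["hard_ham", "spam_2", "spam", "easy_ham", "easy_ham_2"]
labels = [label_for_group(g) for g in groups]
assert labels == ["ham", "spam", "spam", "ham", "ham"]
```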
### Data Splits
Only a _train_ split has been provided.
## Dataset Creation
### Curation Rationale
It is hoped this dataset can help verify that modern NLP tools can solve
old NLP problems.
### Source Data
#### Initial Data Collection and Normalization
[The upstream corpus description](https://spamassassin.apache.org/old/publiccorpus/readme.html)
goes into detail on collection methods. The work here to recover text bodies
is largely done with [email.parser](https://docs.python.org/3/library/email.parser.html)
and [ftfy](https://pypi.org/project/ftfy/).
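A minimal sketch of the kind of MIME-tree dump described for the `text` config, using only the standard-library `email` package (the exact tree shape used by the dataset may differ; this message and the `mime_tree` helper are illustrative):

```python
import email
import json
from email import policy

# A tiny single-part message standing in for a corpus mail file.
RAW = b"""\
From: sender@example.com
Subject: Hello
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8

Just a short test body.
"""

def mime_tree(message):
    """Dump a parsed message as nested [content_type, payload] lists."""
    if message.is_multipart():
        return [message.get_content_type(),
                [mime_tree(part) for part in message.iter_parts()]]
    return [message.get_content_type(), message.get_content()]

msg = email.message_from_bytes(RAW, policy=policy.default)
tree = mime_tree(msg)
json.dumps(tree)  # the tree is JSON-serializable, as the config requires
assert tree[0] == "text/plain"
```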
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.454862505197525,
-0.39378926157951355,
-0.032892853021621704,
0.20720817148685455,
-0.27877259254455566,
-0.24587464332580566,
-0.30326542258262634,
-0.1566317230463028,
0.3272475004196167,
0.9222432971000671,
-0.637323260307312,
-0.6956248879432678,
-0.9324012398719788,
0.5618140101432... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
FreedomIntelligence/CMB | FreedomIntelligence | 2023-08-19T09:45:53Z | 72 | 7 | null | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:zh",
"license:apache-2.0",
"medical",
"biology",
"chemistry",
"region:us"
] | 2023-08-19T09:45:53Z | 2023-07-20T09:08:03.000Z | 2023-07-20T09:08:03 | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- zh
tags:
- medical
- biology
- chemistry
size_categories:
- 100K<n<1M
---
# CMB: A Comprehensive Medical Benchmark in Chinese

<p align="center">
🌐 <a href="https://cmedbenchmark.llmzoo.com/#home" target="_blank">Website</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/CMB" target="_blank">HuggingFace</a>
</p>
## 🌈 Update
* **[2023.08.01]** 🎉🎉🎉 CMB is published!🎉🎉🎉
## 🌐 Download Data
- (Recommended) Download the [zip file](https://github.com/FreedomIntelligence/CMB/tree/main/data) and unzip:
```bash
git clone "https://github.com/FreedomIntelligence/CMB.git" && cd CMB && unzip "./data/CMB.zip" -d "./data/" && rm "./data/CMB.zip"
```
- Or load our data as follows:
```python
from datasets import load_dataset
# CMB-Exam datasets (multiple-choice and multiple-answer questions)
exam_datasets = load_dataset('FreedomIntelligence/CMB','exam')
# CMB-Clin datasets
clin_datasets = load_dataset('FreedomIntelligence/CMB','clin')
```
## 🥇 Leaderboard
Please Check [Leaderboard](https://cmedbenchmark.llmzoo.com/static/leaderboard.html).
## 🥸 Dataset intro

### Components
- CMB-Exam: Comprehensive multi-level assessment for medical knowledge
- Structure: 6 major categories and 28 subcategories, [View Catalog](catalog.md)
- CMB-test: 400 questions per subcategory, 11,200 questions in total
- CMB-val: 280 questions with solutions and explanations; used as source for CoT and few-shot
- CMB-train: 269359 questions for medical knowledge injection
- CMB-Clin: 74 cases of complex medical inquiries
### CMB-Exam Item
```json
{
"exam_type": "医师考试",
"exam_class": "执业医师",
"exam_subject": "口腔执业医师",
"question": "患者,男性,11岁。近2个月来时有低热(37~38℃),全身无明显症状。查体无明显阳性体征。X线检查发现右肺中部有一直径约0.8cm类圆形病灶,边缘稍模糊,肺门淋巴结肿大。此男孩可能患",
"answer": "D",
"question_type": "单项选择题",
"option": {
"A": "小叶型肺炎",
"B": "浸润性肺结核",
"C": "继发性肺结核",
"D": "原发性肺结核",
"E": "粟粒型肺结核"
}
},
```
- exam_type: major category
- exam_class: sub-category
- exam_subject: Specific departments or subdivisions of disciplines
- question_type: *multiple-choice (单项选择题)* or *multiple-answer (多项选择题)*
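Given these fields, an exam item can be rendered as a plain-text prompt for evaluation. This helper and the sample item are illustrative sketches, not the official evaluation code:

```python
def format_exam_item(item: dict) -> str:
    """Render a CMB-Exam item as a question followed by lettered options."""
    lines = [item["question"]]
    lines += [f"{key}. {text}" for key, text in sorted(item["option"].items())]
    return "\n".join(lines)

# Hypothetical item following the schema shown above.
item = {
    "question": "下列哪项是正确的?",
    "answer": "B",
    "question_type": "单项选择题",
    "option": {"A": "选项一", "B": "选项二", "C": "选项三"},
}
prompt = format_exam_item(item)
assert prompt.splitlines()[0] == "下列哪项是正确的?"
assert "B. 选项二" in prompt
```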
### CMB-Clin Item
```json
{
"id": 0,
"title": "案例分析-腹外疝",
"description": "现病史\n(1)病史摘要\n 病人,男,49岁,3小时前解大便后出现右下腹疼痛,右下腹可触及一包块,既往体健。\n(2)主诉\n 右下腹痛并自扪及包块3小时。\n\n体格检查\n体温: T 37.8℃,P 101次/分,呼吸22次/分,BP 100/60mmHg,腹软,未见胃肠型蠕动波,肝脾肋下未及,于右侧腹股沟区可扪及一圆形肿块,约4cm×4cm大小,有压痛、界欠清,且肿块位于腹股沟韧带上内方。\n\n辅助检查\n(1)实验室检查\n 血常规:WBC 5.0×109/L,N 78%。\n 尿常规正常。\n(2)多普勒超声检查\n 沿腹股沟纵切可见一多层分布的混合回声区,宽窄不等,远端膨大,边界整齐,长约4~5cm。\n(3)腹部X线检查\n 可见阶梯状液气平。",
"QA_pairs": [
{
"question": "简述该病人的诊断及诊断依据。",
"solution": "诊断:嵌顿性腹股沟斜疝合并肠梗阻。\n诊断依据:\n①右下腹痛并自扪及包块3小时;\n②有腹胀、呕吐,类似肠梗阻表现;腹部平片可见阶梯状液平,考虑肠梗阻可能;腹部B超考虑,\n腹部包块内可能为肠管可能;\n③有轻度毒性反应或是中毒反应,如 T 37.8℃,P 101次/分,白细胞中性分类78%;\n④腹股沟区包块位于腹股沟韧带上内方。"
},
{
"question": "简述该病人的鉴别诊断。",
"solution": "(1)睾丸鞘膜积液:鞘膜积液所呈现的肿块完全局限在阴囊内,其上界可以清楚地摸到;用透光试验检查肿块,鞘膜积液多为透光(阳性),而疝块则不能透光。\n(2)交通性鞘膜积液:肿块的外形与睾丸鞘膜积液相似。于每日起床后或站立活动时肿块缓慢地出现并增大。平卧或睡觉后肿块逐渐缩小,挤压肿块,其体积也可逐渐缩小。透光试验为阳性。\n(3)精索鞘膜积液:肿块较小,在腹股沟管内,牵拉同侧睾丸可见肿块移动。\n(4)隐睾:腹股沟管内下降不全的睾丸可被误诊为斜疝或精索鞘膜积液。隐睾肿块较小,挤压时可出现特有的胀痛感觉。如患侧阴囊内睾丸缺如,则诊断更为明确。\n(5)急性肠梗阻:肠管被嵌顿的疝可伴发急性肠梗阻,但不应仅满足于肠梗阻的诊断而忽略疝的存在;尤其是病人比较肥胖或疝块较小时,更易发生这类问题而导致治疗上的错误。\n(6)此外,腹股沟区肿块还应与以下疾病鉴别:肿大的淋巴结、动(静)脉瘤、软组织肿瘤、脓肿、\n圆韧带囊肿、子宫内膜异位症等。"
},
{
"question": "简述该病人的治疗原则。",
"solution": "嵌顿性疝原则上需要紧急手术治疗,以防止疝内容物坏死并解除伴发的肠梗阻。术前应做好必要的准备,如有脱水和电解质紊乱,应迅速补液加以纠正。手术的关键在于正确判断疝内容物的活力,然后根据病情确定处理方法。在扩张或切开疝环、解除疝环压迫的前提下,凡肠管呈紫黑色,失去光泽和弹性,刺激后无蠕动和相应肠系膜内无动脉搏动者,即可判定为肠坏死。如肠管尚未坏死,则可将其送回腹腔,按一般易复性疝处理,即行疝囊高位结扎+疝修补术。如肠管确已坏死或一时不能肯定肠管是否已失去活力时,则应在病人全身情况允许的前提下,切除该段肠管并进行一期吻合。凡施行肠切除吻合术的病人,因手术区污染,在高位结扎疝囊后,一般不宜作疝修补术,以免因感染而致修补失败。"
}
]
},
```
- title: name of disease
- description: information of patient
- QA_pairs: a series of questions and their solutions based on the description
## ℹ️ How to evaluate and submit
Refer to [this repository](https://github.com/FreedomIntelligence/CMB) for evaluation and submission instructions.
## 😘 Citation
Please use the following citation if you intend to use our dataset for training or evaluation:
```
@misc{cmedbenchmark,
title={CMB: Chinese Medical Benchmark},
author={Xidong Wang*, Guiming Hardy Chen*, Dingjie Song*, Zhiyi Zhang*, Qingying Xiao, Xiangbo Wu, Feng Jiang, Jianquan Li, Benyou Wang},
note={Xidong Wang, Guiming Hardy Chen, Dingjie Song, and Zhiyi Zhang contributed equally to this github repo.},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/FreedomIntelligence/CMB}},
}
```
## Acknowledgement
- We thank [Shenzhen Research Institute of Big Data](http://www.sribd.cn/) for their enormous support for this project.
- We thank the following doctors for participating in the human evaluation of CMB-Clin:
- 林士军 (香港中文大学(深圳)附属第二医院)
- 常河
- 许晓爽
| [
-0.5504266023635864,
-0.638612687587738,
0.5098429918289185,
0.2267938107252121,
-0.5849028825759888,
-0.26458215713500977,
-0.14126983284950256,
-0.1931179016828537,
0.5690473318099976,
0.19567392766475677,
-0.42502930760383606,
-0.8576377630233765,
-0.48957470059394836,
0.162677705287933... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
youssef101/artelingo-dummy | youssef101 | 2023-07-23T16:21:23Z | 72 | 1 | null | [
"task_categories:image-to-text",
"task_categories:text-classification",
"task_categories:image-classification",
"task_categories:text-to-image",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"language:ar",
"language:zh",
"license:mit",
"Affective Captioning",
"... | 2023-07-23T16:21:23Z | 2023-07-23T14:41:17.000Z | 2023-07-23T14:41:17 | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: art_style
dtype: string
- name: painting
dtype: string
- name: emotion
dtype: string
- name: language
dtype: string
- name: text
dtype: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 18587167692.616
num_examples: 62989
- name: validation
num_bytes: 965978050.797
num_examples: 3191
- name: test
num_bytes: 2330046601.416
num_examples: 6402
download_size: 4565327615
dataset_size: 21883192344.829002
task_categories:
- image-to-text
- text-classification
- image-classification
- text-to-image
- text-generation
language:
- en
- ar
- zh
tags:
- Affective Captioning
- Emotions
- Prediction
- Art
- ArtELingo
pretty_name: ArtELingo
size_categories:
- 100K<n<1M
---
ArtELingo is a benchmark and dataset introduced in a research paper aimed at promoting work on diversity across languages and cultures. It is an extension of ArtEmis, which is a collection of 80,000 artworks from WikiArt with 450,000 emotion labels and English-only captions. ArtELingo expands this dataset by adding 790,000 annotations in Arabic and Chinese. The purpose of these additional annotations is to evaluate the performance of "cultural-transfer" in AI systems.
The dataset in ArtELingo contains many artworks with multiple annotations in three languages, providing a diverse set of data that enables the study of similarities and differences across languages and cultures. The researchers investigate captioning tasks and find that diversity in annotations improves the performance of baseline models.
The goal of ArtELingo is to encourage research on multilinguality and culturally-aware AI. By including annotations in multiple languages and considering cultural differences, the dataset aims to build more human-compatible AI that is sensitive to emotional nuances across various cultural contexts. The researchers believe that studying emotions in this way is crucial to understanding a significant aspect of human intelligence.
In summary, ArtELingo is a dataset that extends ArtEmis by providing annotations in multiple languages and cultures, facilitating research on diversity in AI systems and improving their performance in emotion-related tasks like label prediction and affective caption generation. The dataset is publicly available, and the researchers hope that it will facilitate future studies in multilingual and culturally-aware artificial intelligence. | [
-0.6059170961380005,
-0.17338769137859344,
-0.07522932440042496,
0.31100374460220337,
-0.3320482075214386,
-0.2976214587688446,
0.006921157240867615,
-1.0658894777297974,
0.17862048745155334,
0.035042259842157364,
-0.26658228039741516,
-0.4937658905982971,
-0.6341790556907654,
0.6120194196... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Yuhthe/vietnews_word | Yuhthe | 2023-07-26T03:00:48Z | 72 | 0 | null | [
"task_categories:summarization",
"language:vi",
"region:us"
] | 2023-07-26T03:00:48Z | 2023-07-25T10:14:56.000Z | 2023-07-25T10:14:56 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: guid
dtype: int64
- name: title
dtype: string
- name: abstract
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 325418455
num_examples: 99134
- name: validation
num_bytes: 73397317
num_examples: 22184
- name: test
num_bytes: 74536959
num_examples: 22498
download_size: 246524133
dataset_size: 473352731
task_categories:
- summarization
language:
- vi
---
# Dataset Card for "vietnews_word"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.3643699884414673,
-0.3493482768535614,
0.2884140610694885,
0.29009366035461426,
-0.4748726785182953,
-0.04629191383719444,
0.08186610788106918,
-0.110246442258358,
0.8052920699119568,
0.6811709403991699,
-0.7367669343948364,
-0.9798953533172607,
-0.8001647591590881,
-0.19895532727241516... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ahmed-masry/unichart-pretrain-data | ahmed-masry | 2023-07-30T01:39:51Z | 72 | 1 | null | [
"region:us"
] | 2023-07-30T01:39:51Z | 2023-07-30T01:39:33.000Z | 2023-07-30T01:39:33 | ---
dataset_info:
features:
- name: imgname
dtype: string
- name: query
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1198892722
num_examples: 6898333
download_size: 346172299
dataset_size: 1198892722
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "unichart-pretrain-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5575113296508789,
0.08414893597364426,
0.23763999342918396,
0.09811147302389145,
-0.41778287291526794,
0.10686799138784409,
0.17840176820755005,
0.00812253262847662,
0.7530189156532288,
0.3700107932090759,
-0.9765713214874268,
-0.7932093739509583,
-0.5478115081787109,
-0.437451779842376... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct | open-llm-leaderboard | 2023-08-27T12:27:20Z | 72 | 4 | null | [
"region:us"
] | 2023-08-27T12:27:20Z | 2023-08-18T00:07:51.000Z | 2023-08-18T00:07:51 | ---
pretty_name: Evaluation run of garage-bAInd/Platypus2-70B-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Platypus2-70B-instruct](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-10T02:33:24.373535](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct/blob/main/results_2023-08-10T02%3A33%3A24.373535.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.704161183233485,\n\
\ \"acc_stderr\": 0.030994657706769527,\n \"acc_norm\": 0.7079756766205294,\n\
\ \"acc_norm_stderr\": 0.03096353733559372,\n \"mc1\": 0.4430844553243574,\n\
\ \"mc1_stderr\": 0.017389730346877103,\n \"mc2\": 0.6225956874268151,\n\
\ \"mc2_stderr\": 0.014795440403830226\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173302,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009015\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.691894045010954,\n\
\ \"acc_stderr\": 0.004607669909914972,\n \"acc_norm\": 0.8794064927305317,\n\
\ \"acc_norm_stderr\": 0.0032498873947065104\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.03008574324856567,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.03008574324856567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.025634258115554958,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.025634258115554958\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632158,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632158\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194209,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194209\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262575,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622842,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622842\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.026841514322958927,\n\
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.026841514322958927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9064220183486239,\n \"acc_stderr\": 0.012486841824601963,\n \"\
acc_norm\": 0.9064220183486239,\n \"acc_norm_stderr\": 0.012486841824601963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564026,\n \"\
acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564026\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407266,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407266\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941646,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941646\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.01203672956821606,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.01203672956821606\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6525139664804469,\n\
\ \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.6525139664804469,\n\
\ \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n\
\ \"acc_stderr\": 0.023598858292863054,\n \"acc_norm\": 0.7781350482315113,\n\
\ \"acc_norm_stderr\": 0.023598858292863054\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02073635840806,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02073635840806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.02955545423677884,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.02955545423677884\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5886571056062582,\n\
\ \"acc_stderr\": 0.012567882673803692,\n \"acc_norm\": 0.5886571056062582,\n\
\ \"acc_norm_stderr\": 0.012567882673803692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.027033041151681456,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146606,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146606\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.026711430555538408,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.026711430555538408\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4430844553243574,\n\
\ \"mc1_stderr\": 0.017389730346877103,\n \"mc2\": 0.6225956874268151,\n\
\ \"mc2_stderr\": 0.014795440403830226\n }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Platypus2-70B-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|arc:challenge|25_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hellaswag|10_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:33:24.373535.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:33:24.373535.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T02:33:24.373535.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-10T02:33:24.373535.parquet'
- config_name: results
data_files:
- split: 2023_08_10T02_33_24.373535
path:
- results_2023-08-10T02:33:24.373535.parquet
- split: latest
path:
- results_2023-08-10T02:33:24.373535.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Platypus2-70B-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Platypus2-70B-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Platypus2-70B-instruct](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct",
"harness_truthfulqa_mc_0",
	split="latest")
```
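The glob patterns in this card's YAML encode the task name, few-shot count, and run timestamp directly in each file name. As an illustration, a small hypothetical helper (not part of the `datasets` API) can recover those pieces from such a pattern:

```python
def parse_details_path(path: str) -> tuple[str, int, str]:
    """Split a details-file pattern into (task, n_shot, timestamp).

    Expects the naming scheme used in this card's YAML, e.g.
    '**/details_harness|hendrycksTest-virology|5_2023-08-10T02:33:24.373535.parquet'.
    """
    name = path.rsplit("/", 1)[-1]          # drop the '**/' glob prefix
    body = name.removeprefix("details_").removesuffix(".parquet")
    task, shot_and_ts = body.rsplit("|", 1) # task may itself contain '|'
    n_shot, timestamp = shot_and_ts.split("_", 1)
    return task, int(n_shot), timestamp

task, shots, ts = parse_details_path(
    "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:33:24.373535.parquet"
)
# task == 'harness|hendrycksTest-virology', shots == 5
```

The same scheme applies to the `truthfulqa:mc|0` files, since only the last `|`-separated field carries the shot count and timestamp.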
## Latest results
These are the [latest results from run 2023-08-10T02:33:24.373535](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Platypus2-70B-instruct/blob/main/results_2023-08-10T02%3A33%3A24.373535.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.704161183233485,
"acc_stderr": 0.030994657706769527,
"acc_norm": 0.7079756766205294,
"acc_norm_stderr": 0.03096353733559372,
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6225956874268151,
"mc2_stderr": 0.014795440403830226
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173302,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009015
},
"harness|hellaswag|10": {
"acc": 0.691894045010954,
"acc_stderr": 0.004607669909914972,
"acc_norm": 0.8794064927305317,
"acc_norm_stderr": 0.0032498873947065104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.03008574324856567,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.03008574324856567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.025634258115554958,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.025634258115554958
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194209,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194209
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262575,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622842,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622842
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.026841514322958927,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.026841514322958927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9064220183486239,
"acc_stderr": 0.012486841824601963,
"acc_norm": 0.9064220183486239,
"acc_norm_stderr": 0.012486841824601963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564026,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564026
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407266,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407266
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.625,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941646,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941646
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.01203672956821606,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.01203672956821606
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6525139664804469,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.6525139664804469,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.023598858292863054,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.023598858292863054
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02073635840806,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02073635840806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.02955545423677884,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.02955545423677884
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5886571056062582,
"acc_stderr": 0.012567882673803692,
"acc_norm": 0.5886571056062582,
"acc_norm_stderr": 0.012567882673803692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146606,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146606
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.026711430555538408,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.026711430555538408
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877103,
"mc2": 0.6225956874268151,
"mc2_stderr": 0.014795440403830226
}
}
```
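Once loaded, the per-task entries of this dict can be post-processed directly; the sketch below ranks tasks by accuracy, using a small subset of the values above inlined for illustration:

```python
# Per-task accuracies, copied from the results JSON above (subset only).
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.9102564102564102},
    "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.3333333333333333},
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(
    ((task, scores["acc"]) for task, scores in results.items()),
    key=lambda kv: kv[1],
    reverse=True,
)
# marketing (0.910) > virology (0.536) > college_physics (0.333)
```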
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
tomaarsen/conll2002 (author: tomaarsen, created 2023-09-23T10:04:25Z, last modified 2023-09-23T10:53:11Z, paperswithcode: conll-2002)

---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- es
- nl
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
- part-of-speech
paperswithcode_id: conll-2002
pretty_name: CoNLL-2002
config_names:
- es
- nl
dataset_info:
- config_name: es
features:
- name: id
dtype: string
- name: document_id
dtype: int32
- name: sentence_id
dtype: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': AO
'1': AQ
'2': CC
'3': CS
'4': DA
'5': DE
'6': DD
'7': DI
'8': DN
'9': DP
'10': DT
'11': Faa
'12': Fat
'13': Fc
'14': Fd
'15': Fe
'16': Fg
'17': Fh
'18': Fia
'19': Fit
'20': Fp
'21': Fpa
'22': Fpt
'23': Fs
'24': Ft
'25': Fx
'26': Fz
'27': I
'28': NC
'29': NP
'30': P0
'31': PD
'32': PI
'33': PN
'34': PP
'35': PR
'36': PT
'37': PX
'38': RG
'39': RN
'40': SP
'41': VAI
'42': VAM
'43': VAN
'44': VAP
'45': VAS
'46': VMG
'47': VMI
'48': VMM
'49': VMN
'50': VMP
'51': VMS
'52': VSG
'53': VSI
'54': VSM
'55': VSN
'56': VSP
'57': VSS
'58': Y
'59': Z
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 6738717
num_examples: 8323
- name: validation
num_bytes: 1349064
num_examples: 1915
- name: test
num_bytes: 1306252
num_examples: 1517
download_size: 4140690
dataset_size: 9394033
- config_name: nl
features:
- name: id
dtype: string
- name: document_id
dtype: int32
- name: sentence_id
dtype: int32
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': Adj
'1': Adv
'2': Art
'3': Conj
'4': Int
'5': Misc
'6': N
'7': Num
'8': Prep
'9': Pron
'10': Punc
'11': V
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 5435346
num_examples: 15806
- name: validation
num_bytes: 1017418
num_examples: 2895
- name: test
num_bytes: 1850382
num_examples: 5195
download_size: 3642241
dataset_size: 8303146
---
# Dataset Card for CoNLL-2002
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](https://www.clips.uantwerpen.be/conll2002/ner/)
- **Repository:** [github](https://github.com/teropa/nlp/tree/master/resources/corpora/conll2002)
- **Paper:** [paper](https://www.aclweb.org/anthology/W02-2024/)
- **Point of Contact:** [Erik Tjong Kim Sang](erikt@uia.ua.ac.be)
### Dataset Summary
Named entities are phrases that contain the names of persons, organizations, locations, times and quantities. Example:
[PER Wolff] , currently a journalist in [LOC Argentina] , played with [PER Del Bosque] in the final years of the seventies in [ORG Real Madrid] .
The shared task of CoNLL-2002 concerns language-independent named entity recognition. We will concentrate on four types of named entities: persons, locations, organizations and names of miscellaneous entities that do not belong to the previous three groups. The participants of the shared task will be offered training and test data for at least two languages. They will use the data for developing a named-entity recognition system that includes a machine learning component. Information sources other than the training data may be used in this shared task. We are especially interested in methods that can use additional unannotated data for improving their performance (for example co-training).
### Supported Tasks and Leaderboards
Named Entity Recognition (NER) is a subtask of Information Extraction. Different NER systems were evaluated as a part of the Sixth Message Understanding Conference in 1995 (MUC6). The target language was English. The participating systems performed well. However, many of them used language-specific resources for performing the task and it is unknown how they would have performed on another language than English.
After 1995 NER systems have been developed for some European languages and a few Asian languages. There have been at least two studies that have applied one NER system to different languages. Palmer and Day [PD97] have used statistical methods for finding named entities in newswire articles in Chinese, English, French, Japanese, Portuguese and Spanish. They found that the difficulty of the NER task was different for the six languages but that a large part of the task could be performed with simple methods. Cucerzan and Yarowsky [CY99] used both morphological and contextual clues for identifying named entities in English, Greek, Hindi, Rumanian and Turkish. With minimal supervision, they obtained overall F measures between 40 and 70, depending on the languages used.
- `named-entity-recognition`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A named entity is correct only if it is an exact match of the corresponding entity in the data.
- `parsing`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A part-of-speech tag is correct only if it is equal to the corresponding tag in the data.
### Languages
There are two languages available: Spanish (es) and Dutch (nl).
## Dataset Structure
### Data Instances
The examples look like this:
```
{
'id': '0',
'document_id': 0,
'sentence_id': 0,
'tokens': ['Melbourne', '(', 'Australia', ')', ',', '25', 'may', '(', 'EFE', ')', '.'],
'pos_tags': [29, 21, 29, 22, 13, 59, 28, 21, 28, 22, 20],
'ner_tags': [5, 0, 5, 0, 0, 0, 0, 0, 3, 0, 0]
}
```
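The integer tags in this instance can be decoded with the NER label list declared in this card's YAML; the snippet below inlines that list so it runs without downloading anything (purely illustrative):

```python
# NER label list from the card's YAML (index -> tag name).
NER_NAMES = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

example = {
    "tokens": ["Melbourne", "(", "Australia", ")", ",", "25", "may", "(", "EFE", ")", "."],
    "ner_tags": [5, 0, 5, 0, 0, 0, 0, 0, 3, 0, 0],
}

# Pair each token with its decoded NER tag.
decoded = [(tok, NER_NAMES[t]) for tok, t in zip(example["tokens"], example["ner_tags"])]
# Melbourne -> B-LOC, Australia -> B-LOC, EFE -> B-ORG, everything else -> O
```

When the dataset is loaded with the `datasets` library, the same mapping is available as `ds.features["ner_tags"].feature.names`, so the list does not need to be hard-coded.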
The original data files within the Dutch sub-dataset contain `-DOCSTART-` lines: `-DOCSTART-` is a special line that acts as a boundary between two different documents, and it is filtered out in this implementation.
### Data Fields
- `id`: id of the sample
- `document_id`: an `int32` feature tracking which document the sample is from.
- `sentence_id`: an `int32` feature tracking which sentence in this document the sample is from.
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
- `pos_tags`: the POS tags of each token
The POS tags correspond to this list for Spanish:
```
'AO', 'AQ', 'CC', 'CS', 'DA', 'DE', 'DD', 'DI', 'DN', 'DP', 'DT', 'Faa', 'Fat', 'Fc', 'Fd', 'Fe', 'Fg', 'Fh', 'Fia', 'Fit', 'Fp', 'Fpa', 'Fpt', 'Fs', 'Ft', 'Fx', 'Fz', 'I', 'NC', 'NP', 'P0', 'PD', 'PI', 'PN', 'PP', 'PR', 'PT', 'PX', 'RG', 'RN', 'SP', 'VAI', 'VAM', 'VAN', 'VAP', 'VAS', 'VMG', 'VMI', 'VMM', 'VMN', 'VMP', 'VMS', 'VSG', 'VSI', 'VSM', 'VSN', 'VSP', 'VSS', 'Y', 'Z'
```
And this list for Dutch:
```
'Adj', 'Adv', 'Art', 'Conj', 'Int', 'Misc', 'N', 'Num', 'Prep', 'Pron', 'Punc', 'V'
```
The NER tags correspond to this list:
```
"O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"
```
The NER tags have the same format as in the chunking task: a B denotes the first item of a phrase and an I any non-initial word. There are four types of phrases: person names (PER), organizations (ORG), locations (LOC) and miscellaneous names (MISC).
It is assumed that named entities are non-recursive and non-overlapping. In case a named entity is embedded in another named entity, usually only the top-level entity is marked.
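As a sketch of how these B-/I- tags group into entity spans (a hypothetical helper, not shipped with the dataset):

```python
def bio_to_spans(tags):
    """Group BIO tags into (label, start, end) spans; end is exclusive."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):   # trailing sentinel flushes the last span
        if tag.startswith("I-") and label == tag[2:]:
            continue                          # still inside the current entity
        if start is not None:
            spans.append((label, start, i))   # close the entity that just ended
            start, label = None, None
        if tag.startswith(("B-", "I-")):      # a B- (or stray I-) opens a new entity
            start, label = i, tag[2:]
    return spans

# Tags of the Spanish example instance shown earlier, decoded to strings:
bio_to_spans(["B-LOC", "O", "B-LOC", "O", "O", "O", "O", "O", "B-ORG", "O", "O"])
# -> [('LOC', 0, 1), ('LOC', 2, 3), ('ORG', 8, 9)]
```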
### Data Splits
For both configurations (Spanish and Dutch), there are three splits.
The original splits were named `train`, `testa` and `testb` and they correspond to the `train`, `validation` and `test` splits.
The splits have the following sizes:
| | train | validation | test |
| ----- |-------:|------------:|------:|
| N. Examples (Spanish) | 8324 | 1916 | 1518 |
| N. Examples (Dutch) | 15807 | 2896 | 5196 |
## Dataset Creation
### Curation Rationale
The dataset was created to provide new resources for two languages that were under-served for statistical machine learning at the time: Dutch and Spanish.
### Source Data
The Spanish data is a collection of news wire articles made available by the Spanish EFE News Agency. The articles are from May 2000.
The Dutch data consist of four editions of the Belgian newspaper "De Morgen" of 2000 (June 2, July 1, August 1 and September 1).
#### Initial Data Collection and Normalization
The articles were word-tokenized; information on the exact pre-processing pipeline is unavailable.
#### Who are the source language producers?
The source language was produced by journalists and writers employed by the news agency and newspaper mentioned above.
### Annotations
#### Annotation process
For the Dutch data, the annotator has followed the MITRE and SAIC guidelines for named entity recognition (Chinchor et al., 1999) as well as possible.
#### Who are the annotators?
The Spanish data annotation was carried out by the TALP Research Center of the Technical University of Catalonia (UPC) and the Center of Language and Computation (CLiC) of the University of Barcelona (UB).
The Dutch data was annotated as a part of the Atranos project at the University of Antwerp.
### Personal and Sensitive Information
The data is sourced from newspaper text and only contains mentions of public figures or individuals appearing in the news.
## Considerations for Using the Data
### Social Impact of Dataset
Named Entity Recognition systems can be used to efficiently index news text, making it easy to gather all information pertaining to an organization or individual. Making such resources widely available in languages other than English can support better research and user experience for a larger part of the world's population. At the same time, better indexing and discoverability can also enable surveillance by state actors.
### Discussion of Biases
News text reproduces the biases of society, and any system trained on news data should be cognizant of these limitations and the risk for models to learn spurious correlations in this context, for example between a person's gender and their occupation.
### Other Known Limitations
Users should keep in mind that the dataset only contains news text, which might limit the applicability of the developed systems to other domains.
## Additional Information
### Dataset Curators
The annotation of the Spanish data was funded by the European Commission through the NAMIC project (IST-1999-12392).
### Licensing Information
The licensing status of the data, especially the news source text, is unknown.
### Citation Information
```
@inproceedings{tjong-kim-sang-2002-introduction,
title = "Introduction to the {C}o{NLL}-2002 Shared Task: Language-Independent Named Entity Recognition",
author = "Tjong Kim Sang, Erik F.",
booktitle = "{COLING}-02: The 6th Conference on Natural Language Learning 2002 ({C}o{NLL}-2002)",
year = "2002",
url = "https://www.aclweb.org/anthology/W02-2024",
}
```
### Contributions
Thanks to [@lhoestq](https://github.com/lhoestq) for adding this dataset.
open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit (author: open-llm-leaderboard, last modified 2023-10-04T17:18:22Z, tags: region:us)
---
pretty_name: Evaluation run of upstage/SOLAR-0-70b-16bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [upstage/SOLAR-0-70b-16bit](https://huggingface.co/upstage/SOLAR-0-70b-16bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-04T17:16:57.736703](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit/blob/main/results_2023-10-04T17-16-57.736703.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7050740464217434,\n\
\ \"acc_stderr\": 0.03085018588043536,\n \"acc_norm\": 0.7087855823993987,\n\
\ \"acc_norm_stderr\": 0.03081992944181276,\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n\
\ \"mc2_stderr\": 0.014880875055625352\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n\
\ \"acc_stderr\": 0.00458414401465495,\n \"acc_norm\": 0.8789085839474209,\n\
\ \"acc_norm_stderr\": 0.0032556675321152857\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101453,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101453\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131792,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131792\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5615763546798029,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.5615763546798029,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880242,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880242\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.04075224992216979,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.04075224992216979\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6044692737430167,\n\
\ \"acc_stderr\": 0.01635341541007577,\n \"acc_norm\": 0.6044692737430167,\n\
\ \"acc_norm_stderr\": 0.01635341541007577\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n\
\ \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n\
\ \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5521512385919165,\n\
\ \"acc_stderr\": 0.012700582404768235,\n \"acc_norm\": 0.5521512385919165,\n\
\ \"acc_norm_stderr\": 0.012700582404768235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\
\ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n\
\ \"mc2_stderr\": 0.014880875055625352\n }\n}\n```"
repo_url: https://huggingface.co/upstage/SOLAR-0-70b-16bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|arc:challenge|25_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hellaswag|10_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T17-16-57.736703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T17-16-57.736703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T17-16-57.736703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T17-16-57.736703.parquet'
- config_name: results
data_files:
- split: 2023_10_04T17_16_57.736703
path:
- results_2023-10-04T17-16-57.736703.parquet
- split: latest
path:
- results_2023-10-04T17-16-57.736703.parquet
---
# Dataset Card for Evaluation run of upstage/SOLAR-0-70b-16bit
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/SOLAR-0-70b-16bit
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/SOLAR-0-70b-16bit](https://huggingface.co/upstage/SOLAR-0-70b-16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-04T17:16:57.736703](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-0-70b-16bit/blob/main/results_2023-10-04T17-16-57.736703.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7050740464217434,
"acc_stderr": 0.03085018588043536,
"acc_norm": 0.7087855823993987,
"acc_norm_stderr": 0.03081992944181276,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6224972679005382,
"mc2_stderr": 0.014880875055625352
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587333,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6974706233817964,
"acc_stderr": 0.00458414401465495,
"acc_norm": 0.8789085839474209,
"acc_norm_stderr": 0.0032556675321152857
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.029674167520101453,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.029674167520101453
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5615763546798029,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.5615763546798029,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880242,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880242
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617893,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6044692737430167,
"acc_stderr": 0.01635341541007577,
"acc_norm": 0.6044692737430167,
"acc_norm_stderr": 0.01635341541007577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060006,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5521512385919165,
"acc_stderr": 0.012700582404768235,
"acc_norm": 0.5521512385919165,
"acc_norm_stderr": 0.012700582404768235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6224972679005382,
"mc2_stderr": 0.014880875055625352
}
}
```
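The top-level `"all"` block appears to be an unweighted average of the per-task metrics. As an illustration only, here is a minimal sketch of that aggregation over a small hand-picked subset of the scores above (assumption: the real aggregate averages over every task entry in the run, not just these three):

```python
# Illustrative subset of the per-task accuracies shown above; the real
# "all" aggregate is computed over all task entries in the run.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.33,
    "harness|hendrycksTest-anatomy|5": 0.6518518518518519,
    "harness|hendrycksTest-astronomy|5": 0.8421052631578947,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))  # → 0.608
```

The same pattern extends to `acc_norm`, `mc1`, and `mc2` by swapping in the corresponding per-task values.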
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7192733883857727,
-0.830383837223053,
0.32526829838752747,
0.1955985724925995,
-0.16355930268764496,
-0.029910553246736526,
0.009547673165798187,
-0.181832954287529,
0.5761547088623047,
-0.03428616374731064,
-0.4968773126602173,
-0.6934372782707214,
-0.44172099232673645,
0.2425247877836... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
phosseini/multimodal_satire | phosseini | 2023-10-19T23:22:11Z | 72 | 0 | null | [
"task_categories:image-classification",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2023-10-19T23:22:11Z | 2023-10-19T23:05:51.000Z | 2023-10-19T23:05:51 | ---
dataset_info:
features:
- name: url
dtype: string
- name: headline
dtype: string
- name: image_link
dtype: string
- name: is_satire
dtype: int32
splits:
- name: train
num_bytes: 2841764
num_examples: 10000
download_size: 1268537
dataset_size: 2841764
task_categories:
- image-classification
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset card for "multimodal_satire"
This is the dataset for the paper [A Multi-Modal Method for Satire Detection using Textual and Visual Cues](https://aclanthology.org/2020.nlp4if-1.4/). To obtain the full-text body of the articles, you need to scrape the source websites using the links provided in the dataset.
* GitHub repository: [https://github.com/lilyli2004/satire](https://github.com/lilyli2004/satire)
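Since the card only notes that article bodies must be scraped, here is a minimal, illustrative fetch helper using the standard library (the function name and User-Agent string are my own; the dataset prescribes no particular scraper, and you should respect each site's robots.txt and rate limits):

```python
import urllib.request


def fetch_article_html(url: str, timeout: float = 10.0) -> str:
    """Download the raw HTML for one article URL from the dataset.

    The dataset stores only headlines and links; full-text bodies
    must be retrieved from the source sites yourself.
    """
    req = urllib.request.Request(
        url, headers={"User-Agent": "satire-research/0.1"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")


# Usage (requires network access):
# html = fetch_article_html(row["url"])
```

Extracting the article text from the returned HTML is site-specific; a parser such as `html.parser` or a third-party library would be the usual next step.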
## Reference
If you use this dataset, please cite the following paper:
```
@inproceedings{li-etal-2020-multi-modal,
title = "A Multi-Modal Method for Satire Detection using Textual and Visual Cues",
author = "Li, Lily and
Levi, Or and
Hosseini, Pedram and
Broniatowski, David",
booktitle = "Proceedings of the 3rd NLP4IF Workshop on NLP for Internet Freedom: Censorship, Disinformation, and Propaganda",
month = dec,
year = "2020",
address = "Barcelona, Spain (Online)",
publisher = "International Committee on Computational Linguistics (ICCL)",
url = "https://aclanthology.org/2020.nlp4if-1.4",
pages = "33--38",
abstract = "Satire is a form of humorous critique, but it is sometimes misinterpreted by readers as legitimate news, which can lead to harmful consequences. We observe that the images used in satirical news articles often contain absurd or ridiculous content and that image manipulation is used to create fictional scenarios. While previous work have studied text-based methods, in this work we propose a multi-modal approach based on state-of-the-art visiolinguistic model ViLBERT. To this end, we create a new dataset consisting of images and headlines of regular and satirical news for the task of satire detection. We fine-tune ViLBERT on the dataset and train a convolutional neural network that uses an image forensics technique. Evaluation on the dataset shows that our proposed multi-modal approach outperforms image-only, text-only, and simple fusion baselines.",
}
``` | [
-0.23934176564216614,
-0.811888575553894,
0.30048105120658875,
0.5181170105934143,
-0.3854900598526001,
-0.13285332918167114,
-0.30234023928642273,
-0.4567713141441345,
0.5015853047370911,
0.32747504115104675,
-0.36661025881767273,
-0.2082151621580124,
-0.3143019676208496,
0.38564750552177... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ialvarenga/acl-arc-revised | ialvarenga | 2023-10-28T22:40:25Z | 72 | 0 | null | [
"region:us"
] | 2023-10-28T22:40:25Z | 2023-10-28T22:40:16.000Z | 2023-10-28T22:40:16 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
dataset_info:
features:
- name: text
dtype: string
- name: intent
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
splits:
- name: train
num_bytes: 358284.1064718163
num_examples: 1532
- name: test
num_bytes: 44902.44676409186
num_examples: 192
- name: eval
num_bytes: 44902.44676409186
num_examples: 192
download_size: 231094
dataset_size: 448089.0
---
# Dataset Card for "acl-arc-revised"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.5884234309196472,
-0.2580766975879669,
-0.0054190801456570625,
-0.04139630123972893,
-0.09552944451570511,
0.10154282301664352,
0.36220023036003113,
-0.33332914113998413,
0.783036470413208,
0.7373625636100769,
-0.8932302594184875,
-0.6296548247337341,
-0.4769359230995178,
0.020275766029... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Shrsai/journeys | Shrsai | 2023-11-08T08:45:41Z | 72 | 0 | null | [
"region:us"
] | 2023-11-08T08:45:41Z | 2023-11-04T13:44:42.000Z | 2023-11-04T13:44:42 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
joseluhf11/oct-object-detection-v2-merge | joseluhf11 | 2023-11-22T08:42:10Z | 72 | 0 | null | [
"region:us"
] | 2023-11-22T08:42:10Z | 2023-11-15T15:13:12.000Z | 2023-11-15T15:13:12 | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
splits:
- name: train
num_bytes: 153967507.25
num_examples: 1246
download_size: 71637288
dataset_size: 153967507.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oct-object-detection-v2-merge"
The dataset is composed of images with multiple object detection boxes in COCO format (x, y, w, h). The images are OCT scans (a type of eye scanner output) with boxes indicating features associated with AMD disease.
Changes from v1: images are grouped into a single row per detection class, and overlapping boxes are joined with a merge method. Merge means taking the whole area covered by both boxes.
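The merge method mentioned above can be illustrated with a short sketch (a hypothetical re-implementation, not the script actually used to build this dataset): two overlapping COCO boxes (x, y, w, h) are replaced by the smallest box covering both.

```python
def merge_boxes(a, b):
    """Merge two COCO-format boxes (x, y, w, h) into the smallest box covering both."""
    x = min(a[0], b[0])
    y = min(a[1], b[1])
    x2 = max(a[0] + a[2], b[0] + b[2])
    y2 = max(a[1] + a[3], b[1] + b[3])
    return [x, y, x2 - x, y2 - y]

def boxes_overlap(a, b):
    """True if the two boxes intersect (or touch)."""
    return (a[0] <= b[0] + b[2] and b[0] <= a[0] + a[2]
            and a[1] <= b[1] + b[3] and b[1] <= a[1] + a[3])

# Example: two overlapping boxes collapse into one covering box.
merged = merge_boxes([10, 10, 30, 20], [25, 15, 30, 25])
print(merged)  # [10, 10, 45, 30]
```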
[Source dataset](https://doi.org/10.1101/2023.03.29.534704) | [
-0.775063693523407,
-0.7250112891197205,
-0.015327196568250656,
-0.33774232864379883,
-0.6791936755180359,
0.04435652866959572,
0.5319564342498779,
-0.677436351776123,
0.3467784821987152,
0.8708938360214233,
-0.5193415284156799,
-0.5116208791732788,
-0.6256999373435974,
0.1419089287519455,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
atmallen/qm_alice_easy_2_mixture_1.0e | atmallen | 2023-11-16T18:18:09Z | 72 | 0 | null | [
"region:us"
] | 2023-11-16T18:18:09Z | 2023-11-16T03:33:25.000Z | 2023-11-16T03:33:25 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 12520368.5
num_examples: 117117
- name: validation
num_bytes: 1221097.5
num_examples: 11279
- name: test
num_bytes: 1205746.0
num_examples: 11186
download_size: 3708154
dataset_size: 14947212.0
---
# Dataset Card for "qm_alice_easy_2_mixture_1.0e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.464404433965683,
-0.2721940875053406,
0.3227272629737854,
0.34514787793159485,
-0.25181224942207336,
-0.059361059218645096,
0.5082725882530212,
-0.04981214553117752,
0.7253513932228088,
0.40421271324157715,
-0.7004172801971436,
-0.7131481170654297,
-0.6405553817749023,
-0.34193319082260... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
SonishMaharjan/male-female | SonishMaharjan | 2023-11-22T06:01:51Z | 72 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-22T06:01:51Z | 2023-11-22T04:40:32.000Z | 2023-11-22T04:40:32 | ---
license: unknown
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kyujinpy/Ko-various-dataset | kyujinpy | 2023-11-26T15:51:57Z | 72 | 0 | null | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-11-26T15:51:57Z | 2023-11-24T19:33:35.000Z | 2023-11-24T19:33:35 | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 57552968
num_examples: 38174
download_size: 29047684
dataset_size: 57552968
---
# 🌈Ko-various-dataset
- Includes [kyujinpy/KOR-OpenOrca-Platypus-v3](https://huggingface.co/datasets/kyujinpy/KOR-OpenOrca-Platypus-v3).
- In addition, the `COPA` and `Hellaswag` subsets of [skt/kobest_v1](https://huggingface.co/datasets/skt/kobest_v1) were converted into an instruction-output dataset, following the [adaptLLM](https://huggingface.co/AdaptLLM) paper, and added.
- When you build models or datasets using this dataset, a brief attribution would be a great help to our research 😭😭
# Preprocessing
```
# Make the special text lists, manually.
[\n\t-=+,#/\$?:^$.@*\"–∼①②③④⑤ⓐⓑⓒ㉮㉯㉰㈜®...(중략)...∂Σ∩∅φμσℝλΛ≥℃∉⊂θ±€Øπ√≠≤ε∈∫ωηαβ÷≈ס̊°²/]
```
- Using the regular expression above, characters from foreign languages other than Korean and English, emojis, special characters, etc. were removed.
- Examples whose output answers were too short were removed.
- Translation tasks were removed as much as possible. (Translation tasks are almost 100% erroneous when translated into Korean.)
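A minimal sketch of this kind of character filtering (the character class below is a shortened, hypothetical version of the dataset's full list, which is partially elided above):

```python
import re

# Shortened, illustrative character class; the dataset's actual list is longer.
SPECIAL_CHARS = re.compile(r"[#/\$?:^@*\"–∼①②③④⑤ⓐⓑⓒ㉮㉯㉰㈜®∂Σ∩∅φμσℝλΛ≥℃]")

def clean(text: str) -> str:
    """Strip listed special characters and collapse repeated whitespace."""
    text = SPECIAL_CHARS.sub(" ", text)
    return re.sub(r"\s+", " ", text).strip()

print(clean("① 예시 text® with Σ symbols"))  # "예시 text with symbols"
```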
| [
-0.2851773798465729,
-0.5585208535194397,
0.09797627478837967,
0.2892543077468872,
-0.5685998201370239,
-0.23831045627593994,
-0.22571492195129395,
0.09555920958518982,
0.5237433314323425,
0.7236695885658264,
-0.5010896325111389,
-1.0218784809112549,
-0.6089046001434326,
0.3002511262893677... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
SLPL/naab | SLPL | 2022-11-03T06:33:48Z | 71 | 27 | null | [
"task_categories:fill-mask",
"task_categories:text-generation",
"task_ids:language-modeling",
"task_ids:masked-language-modeling",
"multilinguality:monolingual",
"size_categories:100M<n<1B",
"language:fa",
"license:mit",
"arxiv:2208.13486",
"region:us"
] | 2022-11-03T06:33:48Z | 2022-08-18T13:47:40.000Z | 2022-08-18T13:47:40 | ---
language:
- fa
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100M<n<1B
task_categories:
- fill-mask
- text-generation
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: naab (A ready-to-use plug-and-play corpus in Farsi)
---
# naab: A ready-to-use plug-and-play corpus in Farsi
_[If you want to join our community to keep up with news, models and datasets from naab, click on [this](https://docs.google.com/forms/d/e/1FAIpQLSe8kevFl_ODCx-zapAuOIAQYr8IvkVVaVHOuhRL9Ha0RVJ6kg/viewform) link.]_
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Sharif Speech and Language Processing Lab](https://huggingface.co/SLPL)
- **Paper:** [naab: A ready-to-use plug-and-play corpus for Farsi](https://arxiv.org/abs/2208.13486)
- **Point of Contact:** [Sadra Sabouri](mailto:sabouri.sadra@gmail.com)
### Dataset Summary
naab is the biggest cleaned and ready-to-use open-source textual corpus in Farsi. It contains about 130GB of data, 250 million paragraphs, and 15 billion words. The project name is derived from the Farsi word ناب, which means pure and high-grade. We also provide the raw version of the corpus, called naab-raw, and an easy-to-use pre-processor that can be employed by those who want to build a customized corpus.
You can use this corpus by the commands below:
```python
from datasets import load_dataset
dataset = load_dataset("SLPL/naab")
```
You may need to download parts/splits of this corpus too, if so use the command below (You can find more ways to use it [here](https://huggingface.co/docs/datasets/loading#slice-splits)):
```python
from datasets import load_dataset
dataset = load_dataset("SLPL/naab", split="train[:10%]")
```
**Note: make sure your machine has at least 130 GB of free space; the download may also take a while. If you are short on disk space or bandwidth, you can use the code snippet below to download only custom sections of naab:**
```python
from datasets import load_dataset
# ==========================================================
# You should just change this part in order to download your
# parts of corpus.
indices = {
"train": [5, 1, 2],
"test": [0, 2]
}
# ==========================================================
N_FILES = {
"train": 126,
"test": 3
}
_BASE_URL = "https://huggingface.co/datasets/SLPL/naab/resolve/main/data/"
data_url = {
"train": [_BASE_URL + "train-{:05d}-of-{:05d}.txt".format(x, N_FILES["train"]) for x in range(N_FILES["train"])],
"test": [_BASE_URL + "test-{:05d}-of-{:05d}.txt".format(x, N_FILES["test"]) for x in range(N_FILES["test"])],
}
for index in indices['train']:
assert index < N_FILES['train']
for index in indices['test']:
assert index < N_FILES['test']
data_files = {
"train": [data_url['train'][i] for i in indices['train']],
"test": [data_url['test'][i] for i in indices['test']]
}
print(data_files)
dataset = load_dataset('text', data_files=data_files, use_auth_token=True)
```
### Supported Tasks and Leaderboards
This corpus can be used for training all language models which can be trained by Masked Language Modeling (MLM) or any other self-supervised objective.
- `language-modeling`
- `masked-language-modeling`
## Dataset Structure
Each row of the dataset looks like the example below:
```json
{
    "text": "این یک تست برای نمایش یک پاراگراف در پیکره متنی ناب است."
}
```
+ `text` : the textual paragraph.
### Data Splits
This dataset includes two splits (`train` and `test`). We split the randomly permuted version of the corpus into a (95%, 5%) division for (`train`, `test`). Since validation usually takes place during training on the `train` split, we do not propose a separate split for it.
| | train | test |
|-------------------------|------:|-----:|
| Input Sentences | 225892925 | 11083849 |
| Average Sentence Length | 61 | 25 |
Below you can see the log-based histogram of word/paragraph over the two splits of the dataset.
<div align="center">
<img src="https://huggingface.co/datasets/SLPL/naab/resolve/main/naab-hist.png">
</div>
## Dataset Creation
### Curation Rationale
Due to the lack of large amounts of text data in lower-resource languages such as Farsi, researchers working on these languages have always found it hard to start fine-tuning such models. This can lead to a situation in which the opportunity to fine-tune models lies only in the hands of a few companies or countries, which weakens open science.
The previously largest cleaned, merged textual corpus in Farsi was a 70GB text corpus compiled from 8 big datasets that had been cleaned and could be downloaded directly. Our solution to the issues discussed above is called naab. It provides **126GB** (more than **224 million** sequences and nearly **15 billion** words) as the training corpus and **2.3GB** (nearly **11 million** sequences and nearly **300 million** words) as the test corpus.
### Source Data
The textual corpora that we used as our source data are illustrated in the figure below. It contains 5 corpora which are linked in the coming sections.
<div align="center">
<img src="https://huggingface.co/datasets/SLPL/naab/resolve/main/naab-pie.png">
</div>
#### Persian NLP
[This](https://github.com/persiannlp/persian-raw-text) corpus includes eight corpora that are sorted based on their volume as below:
- [Common Crawl](https://commoncrawl.org/): 65GB ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/commoncrawl_fa_merged.txt))
- [MirasText](https://github.com/miras-tech/MirasText): 12G
- [W2C – Web to Corpus](https://lindat.mff.cuni.cz/repository/xmlui/handle/11858/00-097C-0000-0022-6133-9): 1GB ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/w2c_merged.txt))
- Persian Wikipedia (March 2020 dump): 787MB ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/fawiki_merged.txt))
- [Leipzig Corpora](https://corpora.uni-leipzig.de/): 424M ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/LeipzigCorpus.txt))
- [VOA corpus](https://jon.dehdari.org/corpora/): 66MB ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/voa_persian_2003_2008_cleaned.txt))
- [Persian poems corpus](https://github.com/amnghd/Persian_poems_corpus): 61MB ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/poems_merged.txt))
- [TEP: Tehran English-Persian parallel corpus](http://opus.nlpl.eu/TEP.php): 33MB ([link](https://storage.googleapis.com/danielk-files/farsi-text/merged_files/TEP_fa.txt))
#### AGP
This corpus was a formerly private corpus for ASR Gooyesh Pardaz which is now published for all users by this project. This corpus contains more than 140 million paragraphs summed up in 23GB (after cleaning). This corpus is a mixture of both formal and informal paragraphs that are crawled from different websites and/or social media.
#### OSCAR-fa
[OSCAR](https://oscar-corpus.com/), or Open Super-large Crawled ALMAnaCH coRpus, is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. Data is distributed by language in both original and deduplicated form. We used the unshuffled-deduplicated-fa subset of this corpus; after cleaning, about 36GB remained.
#### Telegram
Telegram, a cloud-based instant messaging service, is a widely used application in Iran. Following this hypothesis, we prepared a list of Telegram channels in Farsi covering various topics including sports, daily news, jokes, movies and entertainment, etc. The text data extracted from mentioned channels mainly contains informal data.
#### LSCP
[The Large Scale Colloquial Persian Language Understanding dataset](https://iasbs.ac.ir/~ansari/lscp/) has 120M sentences from 27M casual Persian tweets, with derivation trees, part-of-speech tags, sentiment polarity, and translations in English, German, Czech, Italian, and Hindi. However, we only used the Farsi part of it; after cleaning, 2.3GB remained. Since the dataset is colloquial, it may help our corpus include more informal sentences, although its proportion relative to the formal paragraphs is small.
#### Initial Data Collection and Normalization
The data collection process was separated into two parts. In the first part, we searched for existing corpora. After downloading these corpora we started to crawl data from some social networks. Then thanks to [ASR Gooyesh Pardaz](https://asr-gooyesh.com/en/) we were provided with enough textual data to start the naab journey.
We used a preprocessor based on some stream-based Linux kernel commands so that this process can be less time/memory-consuming. The code is provided [here](https://github.com/Sharif-SLPL/t5-fa/tree/main/preprocess).
### Personal and Sensitive Information
Since this corpus is essentially a compilation of earlier corpora, we take no responsibility for personal information included in it. If you detect any such violations, please let us know, and we will try our best to remove them from the corpus as soon as possible.
We tried our best to provide anonymity while keeping the crucial information. We shuffled some parts of the corpus so the information passing through possible conversations wouldn't be harmful.
## Additional Information
### Dataset Curators
+ Sadra Sabouri (Sharif University of Technology)
+ Elnaz Rahmati (Sharif University of Technology)
### Licensing Information
MIT
### Citation Information
```
@article{sabouri2022naab,
title={naab: A ready-to-use plug-and-play corpus for Farsi},
author={Sabouri, Sadra and Rahmati, Elnaz and Gooran, Soroush and Sameti, Hossein},
journal={arXiv preprint arXiv:2208.13486},
year={2022}
}
```
DOI: [https://doi.org/10.48550/arXiv.2208.13486](https://doi.org/10.48550/arXiv.2208.13486)
### Contributions
Thanks to [@sadrasabouri](https://github.com/sadrasabouri) and [@elnazrahmati](https://github.com/elnazrahmati) for adding this dataset.
### Keywords
+ Farsi
+ Persian
+ raw text
+ پیکره فارسی
+ پیکره متنی
+ آموزش مدل زبانی
| [
-0.6699302792549133,
-0.5854901671409607,
0.2912845313549042,
0.4441261887550354,
-0.1890028566122055,
0.02756514959037304,
-0.40891748666763306,
-0.2828570008277893,
0.37367671728134155,
0.4747661054134369,
-0.3473145663738251,
-0.8179619908332825,
-0.30917415022850037,
0.3837161660194397... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lewtun/music_genres | lewtun | 2022-11-02T10:27:30Z | 71 | 3 | null | [
"region:us"
] | 2022-11-02T10:27:30Z | 2022-11-02T10:01:46.000Z | 2022-11-02T10:01:46 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: song_id
dtype: int64
- name: genre_id
dtype: int64
- name: genre
dtype: string
splits:
- name: test
num_bytes: 1978321742.996
num_examples: 5076
- name: train
num_bytes: 7844298868.902
num_examples: 19909
download_size: 9793244255
dataset_size: 9822620611.898
---
# Dataset Card for "music_genres"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6923148036003113,
-0.16323406994342804,
0.21209679543972015,
0.3740037977695465,
-0.07627057284116745,
0.0889749675989151,
-0.1700945645570755,
-0.11102080345153809,
0.9579898118972778,
0.46060046553611755,
-1.0349894762039185,
-1.047149658203125,
-0.5535314083099365,
-0.259153395891189... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yhavinga/squad_v2_dutch | yhavinga | 2023-01-21T13:53:27Z | 71 | 1 | squad_v2_dutch | [
"task_categories:question-answering",
"task_ids:open-domain-qa",
"task_ids:extractive-qa",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:monolingual",
"size_categories:100K<n<1M",
"source_datasets:original",
"language:nl",
"license:cc-by-sa-4.0",
"arxiv:... | 2023-01-21T13:53:27Z | 2022-12-17T22:50:45.000Z | 2022-12-17T22:50:45 | ---
pretty_name: SQuAD2.0 Dutch
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- nl
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
- extractive-qa
paperswithcode_id: squad_v2_dutch
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: title_en
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: text_en
dtype: string
- name: answer_start_en
dtype: int32
---
# Dataset Card for "squad_v2_dutch"
## Dataset Description
- **Homepage:** [https://rajpurkar.github.io/SQuAD-explorer/](https://rajpurkar.github.io/SQuAD-explorer/)
## Dataset Summary
The squad_v2_dutch dataset is a machine-translated version of the SQuAD v2 dataset from English to Dutch.
The SQuAD v2 dataset combines the 100,000 questions in SQuAD1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers
to look similar to answerable ones. To do well on SQuAD2.0, systems must not only answer questions when possible, but
also determine when no answer is supported by the paragraph and abstain from answering.
## Challenges and Solutions
One of the main challenges in translating the SQuAD v2 dataset to Dutch was accurately translating the answers, which are often short phrases or single words.
Translating the answers individually would result in obvious mistakes. Examples are
* Destiny's Child -> Het kind van Destiny
* Dangerously in Love -> Gevaarlijk in de liefde
* Imagine -> Stel je voor
* Men in Black -> Mannen in zwart
* Hottest Female Singer of All Time -> De heetste vrouwelijke zanger aller tijden
The correct translation of these phrases often depends on the context in which they are used.
To address this, the title, question, answers, and context were concatenated as a single sequence, separated by the newline character.
When the translated version had the correct number of newlines and did not contain any apparent mixups of the answers with the question and title, it was used.
Otherwise, the one-by-one context-less translation was used as a fallback.
Most examples (~95%) were translated with the context-rich translation:
* train split: context: 123898, no context: 6406
* validation split: context: 10196, no context: 1644
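The concatenate-then-validate strategy described above can be sketched as follows (a simplified illustration with a hypothetical `translate` function, not the exact pipeline used for this dataset):

```python
def translate_with_context(title, question, answers, context, translate):
    """Translate title/question/answers/context as one newline-separated
    sequence; fall back to one-by-one translation if the structure breaks."""
    parts = [title, question, *answers, context]
    translated = translate("\n".join(parts)).split("\n")
    if len(translated) == len(parts):  # newline structure preserved
        return translated
    return [translate(p) for p in parts]  # context-less fallback

# An identity "translator" keeps the newline structure, so the
# context-rich path is taken.
out = translate_with_context("Imagine", "Who wrote it?", ["John Lennon"],
                             "Imagine is a song ...", lambda s: s)
print(out)  # ['Imagine', 'Who wrote it?', 'John Lennon', 'Imagine is a song ...']
```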
### Data Fields
The data fields are the same among all splits.
#### squad_v2
- `id`: a `string` feature.
- `title`: a `string` feature.
- `title_en`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a list of `string` feature.
- `text_en`: a list of `string` feature.
- `answer_start_en`: a `int32` feature.
### Citation Information
```
@article{2016arXiv160605250R,
author = {{Rajpurkar}, Pranav and {Zhang}, Jian and {Lopyrev},
Konstantin and {Liang}, Percy},
title = "{SQuAD: 100,000+ Questions for Machine Comprehension of Text}",
journal = {arXiv e-prints},
year = 2016,
eid = {arXiv:1606.05250},
pages = {arXiv:1606.05250},
archivePrefix = {arXiv},
eprint = {1606.05250},
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@albertvillanova](https://github.com/albertvillanova), [@patrickvonplaten](https://github.com/patrickvonplaten),
[@thomwolf](https://github.com/thomwolf) for adding the https://huggingface.co/datasets/squad_v2 dataset.
This project would not have been possible without compute generously provided by Google through the
[TPU Research Cloud](https://sites.research.google/trc/).
Created by [Yeb Havinga](https://www.linkedin.com/in/yeb-havinga-86530825/)
| [
-0.5370635390281677,
-0.7800182104110718,
0.20716241002082825,
0.4163556396961212,
-0.23648563027381897,
0.03832806646823883,
-0.20803919434547424,
-0.4614105820655823,
0.3278774619102478,
0.369094580411911,
-0.8808327913284302,
-0.44101524353027344,
-0.3923141062259674,
0.4075855910778045... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
keremberke/csgo-object-detection | keremberke | 2023-01-27T13:39:19Z | 71 | 4 | null | [
"task_categories:object-detection",
"roboflow",
"roboflow2huggingface",
"region:us"
] | 2023-01-27T13:39:19Z | 2022-12-29T07:37:55.000Z | 2022-12-29T07:37:55 | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="keremberke/csgo-object-detection" src="https://huggingface.co/datasets/keremberke/csgo-object-detection/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['ct', 'cthead', 't', 'thead']
```
### Number of Images
```json
{'train': 3879, 'valid': 383, 'test': 192}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/csgo-object-detection", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/asd-culfr/wlots/dataset/1](https://universe.roboflow.com/asd-culfr/wlots/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ wlots_dataset,
title = { wlots Dataset },
type = { Open Source Dataset },
author = { asd },
howpublished = { \\url{ https://universe.roboflow.com/asd-culfr/wlots } },
url = { https://universe.roboflow.com/asd-culfr/wlots },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { may },
note = { visited on 2023-01-27 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on December 28, 2022 at 8:08 PM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
It includes 4454 images.
Ct-cthead-t-thead are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 416x416 (Fill (with center crop))
The following augmentation was applied to create 3 versions of each source image:
* Random brightness adjustment of between -15 and +15 percent
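The "Fill (with center crop)" resize above works by scaling the image so its shorter side reaches the target size and then center-cropping the longer side. A rough sketch of the geometry (an approximation for illustration, not Roboflow's exact implementation):

```python
def fill_center_crop_geometry(w: int, h: int, size: int = 416):
    """Return (resized_w, resized_h, crop_box) for a fill-with-center-crop
    resize to a size x size square; crop_box is (left, top, right, bottom)."""
    scale = size / min(w, h)                    # shorter side -> size
    rw, rh = round(w * scale), round(h * scale)
    left, top = (rw - size) // 2, (rh - size) // 2
    return rw, rh, (left, top, left + size, top + size)

# A 1920x1080 frame is scaled to 740x416, then cropped to the central 416x416.
print(fill_center_crop_geometry(1920, 1080))  # (740, 416, (162, 0, 578, 416))
```

With Pillow this would correspond to `img.resize((rw, rh)).crop(crop_box)`.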
| [
-0.5797848701477051,
-0.3773876428604126,
0.38800695538520813,
-0.1940908581018448,
-0.33944520354270935,
-0.0827966183423996,
-0.2393093854188919,
-0.6260851621627808,
0.278962105512619,
0.2454414814710617,
-0.558644711971283,
-0.7751505970954895,
-0.48477593064308167,
0.03432659059762955... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
gretelai/symptom_to_diagnosis | gretelai | 2023-05-24T17:58:04Z | 71 | 4 | null | [
"task_categories:text-classification",
"task_ids:multi-class-classification",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"medical",
"region:us"
] | 2023-05-24T17:58:04Z | 2023-05-23T22:48:27.000Z | 2023-05-23T22:48:27 | ---
license: apache-2.0
task_categories:
- text-classification
task_ids:
- multi-class-classification
language:
- en
tags:
- medical
pretty_name: Gretel/symptoms_to_diagnosis
size_categories:
- 10K<n<100K
---
# Dataset Summary
This dataset contains natural language descriptions of symptoms labeled with 22 corresponding diagnoses. `Gretel/symptom_to_diagnosis` provides 1065 symptom descriptions in the English language labeled with 22 diagnoses, focusing on fine-grained single-domain diagnosis.
## Data Fields
Each row contains the following fields:
* `input_text` : A string field containing symptoms
* `output_text` : A string field containing a diagnosis
Example:
```
{
"output_text": "drug reaction",
"input_text": "I've been having headaches and migraines, and I can't sleep. My whole body shakes and twitches. Sometimes I feel lightheaded."
}
```
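A minimal sketch of how such records could be mapped to integer labels for multi-class classification (the two records here are illustrative; the full dataset covers the 22 diagnoses in the table below):

```python
records = [
    {"input_text": "I've been having headaches and migraines, and I can't sleep.",
     "output_text": "drug reaction"},
    {"input_text": "I get a rash and itchy eyes whenever I am around cats.",
     "output_text": "allergy"},
]

# Build a stable label-to-id mapping from the diagnoses seen in the data.
labels = sorted({r["output_text"] for r in records})
label2id = {name: i for i, name in enumerate(labels)}

pairs = [(r["input_text"], label2id[r["output_text"]]) for r in records]
print(label2id)  # {'allergy': 0, 'drug reaction': 1}
```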
## Diagnoses
This table contains the count of each diagnosis in the train and test splits.
| | Diagnosis | train.jsonl | test.jsonl |
|---:|:--------------------------------|--------------:|-------------:|
| 0 | drug reaction | 40 | 8 |
| 1 | allergy | 40 | 10 |
| 2 | chicken pox | 40 | 10 |
| 3 | diabetes | 40 | 10 |
| 4 | psoriasis | 40 | 10 |
| 5 | hypertension | 40 | 10 |
| 6 | cervical spondylosis | 40 | 10 |
| 7 | bronchial asthma | 40 | 10 |
| 8 | varicose veins | 40 | 10 |
| 9 | malaria | 40 | 10 |
| 10 | dengue | 40 | 10 |
| 11 | arthritis | 40 | 10 |
| 12 | impetigo | 40 | 10 |
| 13 | fungal infection | 39 | 9 |
| 14 | common cold | 39 | 10 |
| 15 | gastroesophageal reflux disease | 39 | 10 |
| 16 | urinary tract infection | 39 | 9 |
| 17 | typhoid | 38 | 9 |
| 18 | pneumonia | 37 | 10 |
| 19 | peptic ulcer disease | 37 | 10 |
| 20 | jaundice | 33 | 7 |
| 21 | migraine | 32 | 10 |
## Data Splits
The data is split to 80% train (853 examples, 167kb) and 20% test (212 examples, 42kb).
## Dataset Creation
Data was filtered to remove unwanted categories and updated using an LLM to create language more consistent with how a patient would describe symptoms in natural language to a doctor.
## Source Data
This dataset was adapted based on the [Symptom2Disease](https://www.kaggle.com/datasets/niyarrbarman/symptom2disease) dataset from Kaggle.
## Personal and Sensitive Information
The symptoms in this dataset were modified from their original format using an LLM and do not contain personal data.
## License
This dataset is licensed Apache 2.0 and free for use. | [
-0.13799336552619934,
-0.5488907694816589,
0.2354319989681244,
0.44940778613090515,
-0.10657478123903275,
-0.24833714962005615,
-0.3236609995365143,
-0.6474828720092773,
0.5563304424285889,
0.6309763789176941,
-0.5656027793884277,
-1.0803812742233276,
-0.9650720953941345,
0.398470610380172... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
pain/MASC | pain | 2023-06-12T19:48:45Z | 71 | 2 | null | [
"task_categories:automatic-speech-recognition",
"language:ar",
"license:cc-by-4.0",
"region:us"
] | 2023-06-12T19:48:45Z | 2023-06-10T10:00:21.000Z | 2023-06-10T10:00:21 | ---
license:
- cc-by-4.0
size_categories:
ar:
- n==1k
task_categories:
- automatic-speech-recognition
task_ids: []
pretty_name: MASC dataset
extra_gated_prompt: >-
By clicking on “Access repository” below, you also agree to not attempt to
determine the identity of speakers in the MASC dataset.
language:
- ar
---
# Dataset Card for MASC (Massive Arabic Speech Corpus)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ieee-dataport.org/open-access/masc-massive-arabic-speech-corpus
- **Paper:** https://ieeexplore.ieee.org/document/10022652
### Dataset Summary
MASC is a dataset that contains 1,000 hours of speech sampled at 16 kHz and crawled from over 700 YouTube channels.
The dataset is multi-regional, multi-genre, and multi-dialect intended to advance the research and development of Arabic speech technology with a special emphasis on Arabic speech recognition.
### Supported Tasks
- Automatic Speech Recognition
### Languages
```
Arabic
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
```python
from datasets import load_dataset
masc = load_dataset("pain/MASC", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
masc = load_dataset("pain/MASC", split="train", streaming=True)
print(next(iter(masc)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
masc = load_dataset("pain/MASC", split="train")
batch_sampler = BatchSampler(RandomSampler(masc), batch_size=32, drop_last=False)
dataloader = DataLoader(masc, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
masc = load_dataset("pain/MASC", split="train", streaming=True)
dataloader = DataLoader(masc, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on MASC with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio chunk (`file_path`), its transcription (`text`), and timing metadata.
```python
{'video_id': 'OGqz9G-JO0E', 'start': 770.6, 'end': 781.835, 'duration': 11.24,
'text': 'اللهم من ارادنا وبلادنا وبلاد المسلمين بسوء اللهم فاشغله في نفسه ورد كيده في نحره واجعل تدبيره تدميره يا رب العالمين',
'type': 'c', 'file_path': '87edeceb-5349-4210-89ad-8c3e91e54062_OGqz9G-JO0E.wav',
'audio': {'path': None,
'array': array([
0.05938721,
0.0539856,
0.03460693, ...,
0.00393677,
0.01745605,
0.03045654
]), 'sampling_rate': 16000
}
}
```
### Data Fields
`video_id` (`string`): ID of the YouTube video the audio chunk was extracted from

`start` (`float64`): Start time of the audio chunk, in seconds

`end` (`float64`): End time of the audio chunk, in seconds

`duration` (`float64`): Duration of the chunk, in seconds

`text` (`string`): The transcription of the chunk

`file_path` (`string`): Path to the audio chunk's file

`type` (`string`): The subset the chunk belongs to: `c` (clean) or `n` (noisy)

`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
### Data Splits
The speech material is subdivided into train, dev, and test portions.
Each split contains both clean and noisy data, which can be distinguished via the `type` field.
### Citation Information
```
@INPROCEEDINGS{10022652,
author={Al-Fetyani, Mohammad and Al-Barham, Muhammad and Abandah, Gheith and Alsharkawi, Adham and Dawas, Maha},
booktitle={2022 IEEE Spoken Language Technology Workshop (SLT)},
title={MASC: Massive Arabic Speech Corpus},
year={2023},
volume={},
number={},
pages={1006-1013},
  doi={10.1109/SLT54892.2023.10022652}
}
``` | [
-0.44090619683265686,
-0.5320404171943665,
-0.03253183141350746,
0.15311694145202637,
-0.2737800180912018,
-0.07331296056509018,
-0.3806711733341217,
-0.16538506746292114,
0.4042015075683594,
0.21531297266483307,
-0.6148012280464172,
-0.7292511463165283,
-0.596987783908844,
-0.024994544684... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
npvinHnivqn/VietnameseDictionary | npvinHnivqn | 2023-07-08T09:13:42Z | 71 | 0 | null | [
"size_categories:20K<n<40K",
"language:vi",
"region:us"
] | 2023-07-08T09:13:42Z | 2023-07-04T13:50:48.000Z | 2023-07-04T13:50:48 | ---
language:
- vi
size_categories:
- 20K<n<40K
---
- This dataset includes ~30k Vietnamese words and their definitions.
-0.11288968473672867,
-0.42932310700416565,
0.08695227652788162,
0.284138560295105,
-0.48775699734687805,
-0.09784255921840668,
-0.08824174851179123,
0.14558139443397522,
-0.24138081073760986,
1.0378440618515015,
-0.4360816478729248,
-0.8891608119010925,
-0.7841073274612427,
0.410679847002... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
germank/hh-rlhf_with_features_flan_t5_large | germank | 2023-07-24T14:19:59Z | 71 | 0 | null | [
"region:us"
] | 2023-07-24T14:19:59Z | 2023-07-24T14:07:42.000Z | 2023-07-24T14:07:42 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: human
dtype: string
- name: assistant_chosen
dtype: string
- name: assistant_rejected
dtype: string
- name: log_score_chosen
dtype: float64
- name: log_score_rejected
dtype: float64
- name: labels
dtype: string
splits:
- name: train
num_bytes: 14434424
num_examples: 9574
- name: test
num_bytes: 14378349
num_examples: 9574
download_size: 15748504
dataset_size: 28812773
---
# Dataset Card for "hh-rlhf_with_features_flan_t5_large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7242331504821777,
-0.29199495911598206,
0.2609374523162842,
0.20676951110363007,
-0.32029563188552856,
0.038459133356809616,
0.05388148874044418,
-0.3903801143169403,
1.0275806188583374,
0.6824893355369568,
-0.7759707570075989,
-0.8304488658905029,
-0.5494959950447083,
-0.05710842087864... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
DynamicSuperb/SpoofDetection_ASVspoof2015 | DynamicSuperb | 2023-11-24T11:46:46Z | 71 | 0 | null | [
"region:us"
] | 2023-11-24T11:46:46Z | 2023-08-11T11:03:56.000Z | 2023-08-11T11:03:56 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 3466466724.875
num_examples: 34177
download_size: 3401976992
dataset_size: 3466466724.875
---
# Dataset Card for "SpoofDetection_ASVspoof2015"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.4332984685897827,
-0.35383865237236023,
0.05164220184087753,
0.508778989315033,
-0.16260121762752533,
0.004268366377800703,
0.4707977771759033,
-0.323646605014801,
0.9134305715560913,
0.5767234563827515,
-0.9210423827171326,
-0.5691960453987122,
-0.6888070702552795,
-0.1659044474363327,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_fangloveskari__Platypus_QLoRA_LLaMA_70b | open-llm-leaderboard | 2023-09-17T21:04:43Z | 71 | 0 | null | [
"region:us"
] | 2023-09-17T21:04:43Z | 2023-08-29T08:46:19.000Z | 2023-08-29T08:46:19 | ---
pretty_name: Evaluation run of fangloveskari/Platypus_QLoRA_LLaMA_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fangloveskari/Platypus_QLoRA_LLaMA_70b](https://huggingface.co/fangloveskari/Platypus_QLoRA_LLaMA_70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fangloveskari__Platypus_QLoRA_LLaMA_70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T21:04:30.246280](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__Platypus_QLoRA_LLaMA_70b/blob/main/results_2023-09-17T21-04-30.246280.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3960780201342282,\n\
\ \"em_stderr\": 0.005008647185447735,\n \"f1\": 0.5245239093959767,\n\
\ \"f1_stderr\": 0.00450887492882971,\n \"acc\": 0.5682691139696489,\n\
\ \"acc_stderr\": 0.011651409152443089\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3960780201342282,\n \"em_stderr\": 0.005008647185447735,\n\
\ \"f1\": 0.5245239093959767,\n \"f1_stderr\": 0.00450887492882971\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3078089461713419,\n \
\ \"acc_stderr\": 0.012714401009923652\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962526\n\
\ }\n}\n```"
repo_url: https://huggingface.co/fangloveskari/Platypus_QLoRA_LLaMA_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T21_04_30.246280
path:
- '**/details_harness|drop|3_2023-09-17T21-04-30.246280.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T21-04-30.246280.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T21_04_30.246280
path:
- '**/details_harness|gsm8k|5_2023-09-17T21-04-30.246280.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T21-04-30.246280.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:45:40.863548.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:45:40.863548.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:45:40.863548.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T21_04_30.246280
path:
- '**/details_harness|winogrande|5_2023-09-17T21-04-30.246280.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T21-04-30.246280.parquet'
- config_name: results
data_files:
- split: 2023_08_29T08_45_40.863548
path:
- results_2023-08-29T08:45:40.863548.parquet
- split: 2023_09_17T21_04_30.246280
path:
- results_2023-09-17T21-04-30.246280.parquet
- split: latest
path:
- results_2023-09-17T21-04-30.246280.parquet
---
# Dataset Card for Evaluation run of fangloveskari/Platypus_QLoRA_LLaMA_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fangloveskari/Platypus_QLoRA_LLaMA_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fangloveskari/Platypus_QLoRA_LLaMA_70b](https://huggingface.co/fangloveskari/Platypus_QLoRA_LLaMA_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
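The split-naming convention can be read off the YAML configs above: a run timestamp such as `2023-09-17T21:04:30.246280` becomes the split name `2023_09_17T21_04_30.246280`. A small sketch of that mapping (inferred from this file's config listing, not an official API):

```python
def run_timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp (as used in the parquet file names) to the
    corresponding split name in this dataset's configs.

    Assumption, inferred from the YAML above: dashes and colons become
    underscores, while the 'T' separator and the fractional-second dot
    are kept as-is.
    """
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split_name("2023-09-17T21:04:30.246280"))
# 2023_09_17T21_04_30.246280
```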
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fangloveskari__Platypus_QLoRA_LLaMA_70b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T21:04:30.246280](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__Platypus_QLoRA_LLaMA_70b/blob/main/results_2023-09-17T21-04-30.246280.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3960780201342282,
"em_stderr": 0.005008647185447735,
"f1": 0.5245239093959767,
"f1_stderr": 0.00450887492882971,
"acc": 0.5682691139696489,
"acc_stderr": 0.011651409152443089
},
"harness|drop|3": {
"em": 0.3960780201342282,
"em_stderr": 0.005008647185447735,
"f1": 0.5245239093959767,
"f1_stderr": 0.00450887492882971
},
"harness|gsm8k|5": {
"acc": 0.3078089461713419,
"acc_stderr": 0.012714401009923652
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962526
}
}
```
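The aggregate block above can be sanity-checked by hand: for this run, the "all" accuracy matches the unweighted mean of the two accuracy-reporting tasks (gsm8k and winogrande), while the em/f1 values are carried over from drop. A minimal check, with numbers copied from the JSON above (this is an observation about this file, not an official definition of the leaderboard's aggregation):

```python
# Numbers copied verbatim from the latest-results JSON above.
gsm8k_acc = 0.3078089461713419
winogrande_acc = 0.8287292817679558
reported_all_acc = 0.5682691139696489

# The "all" accuracy appears to be the unweighted mean of the
# per-task accuracies for the tasks that report `acc`.
computed_all_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(computed_all_acc - reported_all_acc) < 1e-12

# The same relationship holds for the reported standard errors.
gsm8k_stderr = 0.012714401009923652
winogrande_stderr = 0.010588417294962526
reported_all_stderr = 0.011651409152443089
assert abs((gsm8k_stderr + winogrande_stderr) / 2 - reported_all_stderr) < 1e-12
```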
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ | open-llm-leaderboard | 2023-08-31T00:31:57Z | 71 | 0 | null | ["region:us"] | 2023-08-31T00:31:57Z | 2023-08-31T00:30:59.000Z | 2023-08-31T00:30:59
---
pretty_name: Evaluation run of TheBloke/Genz-70b-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Genz-70b-GPTQ](https://huggingface.co/TheBloke/Genz-70b-GPTQ) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T00:30:34.342002](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ/blob/main/results_2023-08-31T00%3A30%3A34.342002.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7017249416277331,\n\
\ \"acc_stderr\": 0.030832772804323012,\n \"acc_norm\": 0.70569345061239,\n\
\ \"acc_norm_stderr\": 0.03080075128019408,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6228267270427654,\n\
\ \"mc2_stderr\": 0.014836432877772263\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.689205337582155,\n\
\ \"acc_stderr\": 0.004618730353217047,\n \"acc_norm\": 0.8764190400318662,\n\
\ \"acc_norm_stderr\": 0.0032843028764223\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.02557625706125384,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.02557625706125384\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632159,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632159\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822523,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.023119362758232294,\n\
\ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232294\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611769,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611769\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640262,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640262\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941642,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941642\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999876,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999876\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252562,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252562\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5754189944134078,\n\
\ \"acc_stderr\": 0.01653117099327888,\n \"acc_norm\": 0.5754189944134078,\n\
\ \"acc_norm_stderr\": 0.01653117099327888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398205,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149897,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149897\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5534550195567145,\n\
\ \"acc_stderr\": 0.012697046024399654,\n \"acc_norm\": 0.5534550195567145,\n\
\ \"acc_norm_stderr\": 0.012697046024399654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103135,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103135\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.017201662169789772,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.017201662169789772\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155754,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155754\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6228267270427654,\n\
\ \"mc2_stderr\": 0.014836432877772263\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Genz-70b-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|arc:challenge|25_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hellaswag|10_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T00:30:34.342002.parquet'
- config_name: results
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- results_2023-08-31T00:30:34.342002.parquet
- split: latest
path:
- results_2023-08-31T00:30:34.342002.parquet
---
# Dataset Card for Evaluation run of TheBloke/Genz-70b-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Genz-70b-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Genz-70b-GPTQ](https://huggingface.co/TheBloke/Genz-70b-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ",
"harness_truthfulqa_mc_0",
	split="latest")
```
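The "all" entry in the "results" config is a simple mean of each metric over the per-task entries. A minimal sketch of that aggregation, using hypothetical placeholder scores rather than the actual run values:

```python
# Sketch: averaging per-task metrics into an "all"-style summary.
# The task names follow the harness convention; the scores are made up.
from statistics import mean

task_metrics = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.64, "acc_norm": 0.64},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.82, "acc_norm": 0.82},
    "harness|hendrycksTest-virology|5": {"acc": 0.53, "acc_norm": 0.53},
}

def aggregate(metrics: dict) -> dict:
    """Mean of each metric key across all tasks that report it."""
    keys = {k for m in metrics.values() for k in m}
    return {k: mean(m[k] for m in metrics.values() if k in m) for k in keys}

summary = aggregate(task_metrics)
print(summary)
```

The real leaderboard pipeline also tracks standard errors per metric, but the headline numbers reduce to this kind of mean.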
## Latest results
These are the [latest results from run 2023-08-31T00:30:34.342002](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ/blob/main/results_2023-08-31T00%3A30%3A34.342002.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7017249416277331,
"acc_stderr": 0.030832772804323012,
"acc_norm": 0.70569345061239,
"acc_norm_stderr": 0.03080075128019408,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6228267270427654,
"mc2_stderr": 0.014836432877772263
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.689205337582155,
"acc_stderr": 0.004618730353217047,
"acc_norm": 0.8764190400318662,
"acc_norm_stderr": 0.0032843028764223
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.02557625706125384,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.02557625706125384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632159,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.023119362758232294,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.023119362758232294
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611769,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611769
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073312,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640262,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640262
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941642,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941642
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999876,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999876
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252562,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252562
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5754189944134078,
"acc_stderr": 0.01653117099327888,
"acc_norm": 0.5754189944134078,
"acc_norm_stderr": 0.01653117099327888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398205,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149897,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149897
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5534550195567145,
"acc_stderr": 0.012697046024399654,
"acc_norm": 0.5534550195567145,
"acc_norm_stderr": 0.012697046024399654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103135,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103135
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.017201662169789772,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.017201662169789772
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.025172984350155754,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.025172984350155754
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6228267270427654,
"mc2_stderr": 0.014836432877772263
}
}
```
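These per-task results are plain JSON keyed by harness task name; a minimal sketch of pulling one subject's accuracy out of such a blob (the snippet inlines a tiny excerpt copied from the results above rather than fetching the real file, whose location is not shown here):

```python
import json

# Tiny inline excerpt mirroring the structure of the results above
results_json = """
{
  "harness|hendrycksTest-world_religions|5": {
    "acc": 0.8771929824561403,
    "acc_stderr": 0.025172984350155754,
    "acc_norm": 0.8771929824561403,
    "acc_norm_stderr": 0.025172984350155754
  }
}
"""

results = json.loads(results_json)
task = results["harness|hendrycksTest-world_religions|5"]
print(round(task["acc"], 4))  # prints 0.8772
```

The same pattern applies to any of the `harness|...` keys above; `acc_norm` equals `acc` for most MMLU subjects in this run.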
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.725200891494751,
-0.8592302203178406,
0.28859516978263855,
0.19526544213294983,
-0.1812809854745865,
-0.04658810794353485,
0.027933500707149506,
-0.19353823363780975,
0.5235565900802612,
-0.0475606769323349,
-0.4963712990283966,
-0.7319831252098083,
-0.4238191545009613,
0.22005215287208... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
martinsinnona/visdecode_web | martinsinnona | 2023-11-21T23:23:03Z | 71 | 0 | null | [
"region:us"
] | 2023-11-21T23:23:03Z | 2023-08-31T18:51:05.000Z | 2023-08-31T18:51:05 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: test
num_bytes: 212518.0
num_examples: 37
download_size: 0
dataset_size: 212518.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ | open-llm-leaderboard | 2023-09-01T08:40:25Z | 71 | 0 | null | [
"region:us"
] | 2023-09-01T08:40:25Z | 2023-09-01T08:39:28.000Z | 2023-09-01T08:39:28 | ---
pretty_name: Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Platypus2-70B-Instruct-GPTQ](https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T08:39:03.285201](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ/blob/main/results_2023-09-01T08%3A39%3A03.285201.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6985296232204664,\n\
\ \"acc_stderr\": 0.03125037426870383,\n \"acc_norm\": 0.7020835749710057,\n\
\ \"acc_norm_stderr\": 0.031223245232596956,\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6253657801165746,\n\
\ \"mc2_stderr\": 0.01474854589221215\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6863174666401115,\n\
\ \"acc_stderr\": 0.004630407476835178,\n \"acc_norm\": 0.8755228042222665,\n\
\ \"acc_norm_stderr\": 0.003294504807555233\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948614,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948614\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774565,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774565\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.02569032176249384,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.02569032176249384\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\
acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n\
\ \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334334,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334334\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.908256880733945,\n \"acc_stderr\": 0.012376323409137116,\n \"\
acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137116\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \"\
acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802277,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802277\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752596,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752596\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.646927374301676,\n\
\ \"acc_stderr\": 0.01598420454526858,\n \"acc_norm\": 0.646927374301676,\n\
\ \"acc_norm_stderr\": 0.01598420454526858\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046105,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046105\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5860495436766623,\n\
\ \"acc_stderr\": 0.012579699631289262,\n \"acc_norm\": 0.5860495436766623,\n\
\ \"acc_norm_stderr\": 0.012579699631289262\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.01736247376214661,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.01736247376214661\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160875,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160875\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6253657801165746,\n\
\ \"mc2_stderr\": 0.01474854589221215\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|arc:challenge|25_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hellaswag|10_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T08:39:03.285201.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T08:39:03.285201.parquet'
- config_name: results
data_files:
- split: 2023_09_01T08_39_03.285201
path:
- results_2023-09-01T08:39:03.285201.parquet
- split: latest
path:
- results_2023-09-01T08:39:03.285201.parquet
---
# Dataset Card for Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Platypus2-70B-Instruct-GPTQ](https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-01T08:39:03.285201](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ/blob/main/results_2023-09-01T08%3A39%3A03.285201.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6985296232204664,
"acc_stderr": 0.03125037426870383,
"acc_norm": 0.7020835749710057,
"acc_norm_stderr": 0.031223245232596956,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6253657801165746,
"mc2_stderr": 0.01474854589221215
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6863174666401115,
"acc_stderr": 0.004630407476835178,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.003294504807555233
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948614,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948614
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774565,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774565
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.02569032176249384,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.02569032176249384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334334,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334334
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137116,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137116
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802277,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802277
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752596,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752596
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.646927374301676,
"acc_stderr": 0.01598420454526858,
"acc_norm": 0.646927374301676,
"acc_norm_stderr": 0.01598420454526858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.02103851777015737,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.02103851777015737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5860495436766623,
"acc_stderr": 0.012579699631289262,
"acc_norm": 0.5860495436766623,
"acc_norm_stderr": 0.012579699631289262
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.01736247376214661,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.01736247376214661
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160875,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160875
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6253657801165746,
"mc2_stderr": 0.01474854589221215
}
}
```
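Every per-task block in the results above shares the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), which makes the file easy to post-process. A minimal sketch of averaging `acc` over the MMLU (`hendrycksTest`) tasks — the miniature inline payload below just mirrors two of the blocks above and is illustrative, not the full results file:

```python
import json

# Miniature payload mirroring two of the per-task blocks above (illustrative only)
results = json.loads("""
{
  "harness|hendrycksTest-management|5": {"acc": 0.8446601941747572},
  "harness|hendrycksTest-marketing|5": {"acc": 0.9017094017094017},
  "harness|truthfulqa:mc|0": {"mc1": 0.4455324357405141}
}
""")

# Keep only the MMLU (hendrycksTest) task blocks and average their `acc`
accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mean_acc = sum(accs) / len(accs)
print(f"{len(accs)} MMLU tasks, mean acc = {mean_acc:.4f}")  # → 2 MMLU tasks, mean acc = 0.8732
```

Applied to a complete results file (e.g. the downloaded `results_*.json` read with `json.load`), the same prefix filter should select all 57 `hendrycksTest` subtasks; swapping the prefix for another harness name (e.g. `harness|truthfulqa`) aggregates a different suite.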
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.6895058155059814,
-0.8826425671577454,
0.26816877722740173,
0.1894141584634781,
-0.19374707341194153,
-0.044900692999362946,
0.011070900596678257,
-0.1846419870853424,
0.5217389464378357,
-0.03949103504419327,
-0.46974340081214905,
-0.6688849925994873,
-0.42057186365127563,
0.2146529406... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5 | open-llm-leaderboard | 2023-09-02T15:52:38Z | 71 | 0 | null | [
"region:us"
] | 2023-09-02T15:52:38Z | 2023-09-02T15:51:43.000Z | 2023-09-02T15:51:43 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T15:51:19.541700](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5/blob/main/results_2023-09-02T15%3A51%3A19.541700.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6953752417773453,\n\
\ \"acc_stderr\": 0.03133403952717257,\n \"acc_norm\": 0.6992145917201728,\n\
\ \"acc_norm_stderr\": 0.0313044221682843,\n \"mc1\": 0.4589963280293758,\n\
\ \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6344801220097422,\n\
\ \"mc2_stderr\": 0.014915958195041953\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.01365998089427737,\n\
\ \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428175\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6800438159729137,\n\
\ \"acc_stderr\": 0.004655059308602616,\n \"acc_norm\": 0.8724357697669787,\n\
\ \"acc_norm_stderr\": 0.0033292216060435208\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708045,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047933,\n \"\
acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047933\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.717948717948718,\n \"acc_stderr\": 0.022815813098896597,\n \
\ \"acc_norm\": 0.717948717948718,\n \"acc_norm_stderr\": 0.022815813098896597\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7689075630252101,\n \"acc_stderr\": 0.027381406927868883,\n\
\ \"acc_norm\": 0.7689075630252101,\n \"acc_norm_stderr\": 0.027381406927868883\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080438,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097655,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097655\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216055,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216055\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6067039106145251,\n\
\ \"acc_stderr\": 0.016337268694270126,\n \"acc_norm\": 0.6067039106145251,\n\
\ \"acc_norm_stderr\": 0.016337268694270126\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.021613809395224802,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.021613809395224802\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5638852672750978,\n\
\ \"acc_stderr\": 0.012665568135455321,\n \"acc_norm\": 0.5638852672750978,\n\
\ \"acc_norm_stderr\": 0.012665568135455321\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02679956202488766,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02679956202488766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7401960784313726,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n\
\ \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6344801220097422,\n\
\ \"mc2_stderr\": 0.014915958195041953\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|arc:challenge|25_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hellaswag|10_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T15:51:19.541700.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T15:51:19.541700.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T15:51:19.541700.parquet'
- config_name: results
data_files:
- split: 2023_09_02T15_51_19.541700
path:
- results_2023-09-02T15:51:19.541700.parquet
- split: latest
path:
- results_2023-09-02T15:51:19.541700.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5",
"harness_truthfulqa_mc_0",
	split="latest")
```
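The per-task config names follow a regular pattern visible in the YAML listing above: the harness task id with `-` and `:` replaced by `_`, plus the few-shot count. A minimal helper sketching that convention (inferred from the listing above, not an official API):

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build a config name such as 'harness_hendrycksTest_college_physics_5'
    from a harness task id such as 'hendrycksTest-college_physics'."""
    # Both '-' and ':' in task ids appear as '_' in the config names above.
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{n_shot}"

# e.g. harness_config_name("truthfulqa:mc", 0) -> "harness_truthfulqa_mc_0"
```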
## Latest results
These are the [latest results from run 2023-09-02T15:51:19.541700](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v5/blob/main/results_2023-09-02T15%3A51%3A19.541700.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6953752417773453,
"acc_stderr": 0.03133403952717257,
"acc_norm": 0.6992145917201728,
"acc_norm_stderr": 0.0313044221682843,
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6344801220097422,
"mc2_stderr": 0.014915958195041953
},
"harness|arc:challenge|25": {
"acc": 0.6774744027303754,
"acc_stderr": 0.01365998089427737,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428175
},
"harness|hellaswag|10": {
"acc": 0.6800438159729137,
"acc_stderr": 0.004655059308602616,
"acc_norm": 0.8724357697669787,
"acc_norm_stderr": 0.0033292216060435208
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708045,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047933,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.022815813098896597,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.022815813098896597
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7689075630252101,
"acc_stderr": 0.027381406927868883,
"acc_norm": 0.7689075630252101,
"acc_norm_stderr": 0.027381406927868883
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080438,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097655,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097655
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216055,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216055
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6067039106145251,
"acc_stderr": 0.016337268694270126,
"acc_norm": 0.6067039106145251,
"acc_norm_stderr": 0.016337268694270126
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.021613809395224802,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.021613809395224802
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5638852672750978,
"acc_stderr": 0.012665568135455321,
"acc_norm": 0.5638852672750978,
"acc_norm_stderr": 0.012665568135455321
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02679956202488766,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02679956202488766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6344801220097422,
"mc2_stderr": 0.014915958195041953
}
}
```
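Since the results above are plain JSON, they can be post-processed directly. The sketch below uses a hand-copied subset of the entries above (not loaded from the Hub) to average the MMLU (`hendrycksTest`) sub-task accuracies:

```python
# Illustrative subset of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|truthfulqa:mc|0": {"mc1": 0.4589963280293758},
}

# Average accuracy over the MMLU (hendrycksTest) tasks only.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # 0.5111
```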
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7230156660079956,
-0.8363878130912781,
0.2993108928203583,
0.17560699582099915,
-0.20518164336681366,
-0.04088636115193367,
0.0074128080159425735,
-0.27989599108695984,
0.5756916403770447,
-0.04767053946852684,
-0.5304084420204163,
-0.6991564631462097,
-0.4418387711048126,
0.21657155454... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
CreatorPhan/QA_6_2048 | CreatorPhan | 2023-09-11T15:47:32Z | 71 | 0 | null | [
"region:us"
] | 2023-09-11T15:47:32Z | 2023-09-11T15:31:05.000Z | 2023-09-11T15:31:05 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble | open-llm-leaderboard | 2023-09-14T11:42:18Z | 71 | 0 | null | [
"region:us"
] | 2023-09-14T11:42:18Z | 2023-09-14T11:41:18.000Z | 2023-09-14T11:41:18 | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-70B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T11:41:03.022396](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble/blob/main/results_2023-09-14T11-41-03.022396.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6934330265245879,\n\
\ \"acc_stderr\": 0.031312838620430335,\n \"acc_norm\": 0.697335554746802,\n\
\ \"acc_norm_stderr\": 0.03128337547678218,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.01746084997587397,\n \"mc2\": 0.6479539766332348,\n\
\ \"mc2_stderr\": 0.014916593992436448\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n\
\ \"acc_stderr\": 0.00468968597815517,\n \"acc_norm\": 0.867755427205736,\n\
\ \"acc_norm_stderr\": 0.0033806414709899157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.040287315329475576,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.040287315329475576\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493844,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.025690321762493844\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.021417242936321582,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.021417242936321582\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232805,\n \"\
acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.021328337570804365,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.021328337570804365\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n\
\ \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 0.8633461047254151,\n\
\ \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5743016759776536,\n\
\ \"acc_stderr\": 0.01653682964899712,\n \"acc_norm\": 0.5743016759776536,\n\
\ \"acc_norm_stderr\": 0.01653682964899712\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.024406162094668907,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.024406162094668907\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.02228231394977488,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.02228231394977488\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248345,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248345\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7418300653594772,\n \"acc_stderr\": 0.017704531653250078,\n \
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.017704531653250078\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.01746084997587397,\n \"mc2\": 0.6479539766332348,\n\
\ \"mc2_stderr\": 0.014916593992436448\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|arc:challenge|25_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hellaswag|10_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T11-41-03.022396.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T11-41-03.022396.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T11-41-03.022396.parquet'
- config_name: results
data_files:
- split: 2023_09_14T11_41_03.022396
path:
- results_2023-09-14T11-41-03.022396.parquet
- split: latest
path:
- results_2023-09-14T11-41-03.022396.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-70B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-14T11:41:03.022396](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble/blob/main/results_2023-09-14T11-41-03.022396.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6934330265245879,
"acc_stderr": 0.031312838620430335,
"acc_norm": 0.697335554746802,
"acc_norm_stderr": 0.03128337547678218,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.01746084997587397,
"mc2": 0.6479539766332348,
"mc2_stderr": 0.014916593992436448
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.6707827126070504,
"acc_stderr": 0.00468968597815517,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.0033806414709899157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.040287315329475576,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.040287315329475576
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.025690321762493844,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.025690321762493844
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.013613614800232805,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.013613614800232805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8633461047254151,
"acc_stderr": 0.012282876868629234,
"acc_norm": 0.8633461047254151,
"acc_norm_stderr": 0.012282876868629234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5743016759776536,
"acc_stderr": 0.01653682964899712,
"acc_norm": 0.5743016759776536,
"acc_norm_stderr": 0.01653682964899712
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668907,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668907
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.02228231394977488,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.02228231394977488
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248345,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.017704531653250078,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.017704531653250078
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.01746084997587397,
"mc2": 0.6479539766332348,
"mc2_stderr": 0.014916593992436448
}
}
```
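For quick programmatic access, the headline metrics above can be pulled out of the parsed results dictionary. The snippet below is a minimal sketch that assumes the JSON structure shown here (an `"all"` aggregate section plus one section per task); the excerpted `results` dict stands in for the output of `json.load()` on the downloaded results file.

```python
import json

# Small excerpt of the results JSON shown above; in practice this dict
# would come from json.load() on the downloaded results file.
results = {
    "all": {"acc": 0.6934330265245879, "acc_norm": 0.697335554746802},
    "harness|arc:challenge|25": {"acc": 0.6851535836177475, "acc_norm": 0.7184300341296929},
    "harness|hellaswag|10": {"acc": 0.6707827126070504, "acc_norm": 0.867755427205736},
}

# Collect per-task accuracies, skipping the aggregate "all" entry.
per_task_acc = {task: m["acc"] for task, m in results.items() if task != "all"}

print(f"aggregate acc: {results['all']['acc']:.4f}")
for task, acc in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f}")
```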
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7298745512962341,
-0.8967187404632568,
0.2794235646724701,
0.18794451653957367,
-0.19760584831237793,
-0.07307867705821991,
0.053942542523145676,
-0.28475040197372437,
0.6079226732254028,
-0.03295345604419708,
-0.4981336295604706,
-0.6856961250305176,
-0.43736064434051514,
0.22273194789... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
SEACrowd/id_hoax_news | SEACrowd | 2023-09-26T12:28:34Z | 71 | 0 | null | [
"language:ind",
"hoax-news-classification",
"region:us"
] | 2023-09-26T12:28:34Z | 2023-09-26T11:11:17.000Z | 2023-09-26T11:11:17 | ---
tags:
- hoax-news-classification
language:
- ind
---
# id_hoax_news
This research builds an automatic hoax news detector and collects 250 pages of hoax and valid news articles in the Indonesian language.
Each data sample is annotated by three reviewers, and the final tag is obtained by majority vote among those three reviewers.
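The majority-vote aggregation described above can be sketched as follows; the function name and label strings are illustrative, not part of the released data.

```python
from collections import Counter

def majority_vote(labels):
    """Return the label chosen by the most reviewers (here, 3 reviewers per sample)."""
    counts = Counter(labels)
    label, _ = counts.most_common(1)[0]
    return label

# Three reviewers tag each article as "hoax" or "valid"; the final tag is the majority.
annotations = [
    ["hoax", "hoax", "valid"],
    ["valid", "valid", "valid"],
]
final_tags = [majority_vote(a) for a in annotations]
print(final_tags)  # ['hoax', 'valid']
```

With three reviewers and two labels a tie is impossible, so the vote is always decisive.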
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@INPROCEEDINGS{8265649, author={Pratiwi, Inggrid Yanuar Risca and Asmara, Rosa Andrie and Rahutomo, Faisal}, booktitle={2017 11th International Conference on Information & Communication Technology and System (ICTS)}, title={Study of hoax news detection using naïve bayes classifier in Indonesian language}, year={2017}, volume={}, number={}, pages={73-78}, doi={10.1109/ICTS.2017.8265649}}
```
## License
Creative Commons Attribution 4.0 International
## Homepage
[https://data.mendeley.com/datasets/p3hfgr5j3m/1](https://data.mendeley.com/datasets/p3hfgr5j3m/1)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) | [
-0.46343839168548584,
-0.8167829513549805,
0.08878783136606216,
0.49279987812042236,
-0.2635973393917084,
-0.22049708664417267,
0.13914884626865387,
-0.5048073530197144,
0.7369095087051392,
0.7414425015449524,
-0.24025607109069824,
-0.5887781381607056,
-0.42901623249053955,
0.6711003184318... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_pankajmathur__model_007 | open-llm-leaderboard | 2023-10-09T02:04:34Z | 71 | 0 | null | [
"region:us"
] | 2023-10-09T02:04:34Z | 2023-10-09T02:03:33.000Z | 2023-10-09T02:03:33 | ---
pretty_name: Evaluation run of pankajmathur/model_007
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pankajmathur/model_007](https://huggingface.co/pankajmathur/model_007) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__model_007\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T02:03:09.335068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__model_007/blob/main/results_2023-10-09T02-03-09.335068.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6901502879968988,\n\
\ \"acc_stderr\": 0.031344534847114004,\n \"acc_norm\": 0.6939037892141556,\n\
\ \"acc_norm_stderr\": 0.03131458982120537,\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.6312306236860621,\n\
\ \"mc2_stderr\": 0.014945471343395618\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6908982274447322,\n\
\ \"acc_stderr\": 0.004611787665905346,\n \"acc_norm\": 0.8765186217884884,\n\
\ \"acc_norm_stderr\": 0.003283165867631372\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745657,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745657\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706467,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706467\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821678,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821678\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078894,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078894\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7436974789915967,\n \"acc_stderr\": 0.02835962087053395,\n \
\ \"acc_norm\": 0.7436974789915967,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.0134199390186812,\n \"acc_norm\"\
: 0.8899082568807339,\n \"acc_norm_stderr\": 0.0134199390186812\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n\
\ \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445815,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445815\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305724,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5497206703910614,\n\
\ \"acc_stderr\": 0.016639615236845817,\n \"acc_norm\": 0.5497206703910614,\n\
\ \"acc_norm_stderr\": 0.016639615236845817\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182651,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182651\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.02202136610022019,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.02202136610022019\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291477,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.545632333767927,\n\
\ \"acc_stderr\": 0.012716941720734818,\n \"acc_norm\": 0.545632333767927,\n\
\ \"acc_norm_stderr\": 0.012716941720734818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377197,\n\
\ \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377197\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n\
\ \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7918367346938775,\n\
\ \"acc_stderr\": 0.025991117672813296,\n \"acc_norm\": 0.7918367346938775,\n\
\ \"acc_norm_stderr\": 0.025991117672813296\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n\
\ \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n\
\ \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n\
\ \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n\
\ \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n\
\ \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.01741294198611531,\n\
\ \"mc2\": 0.6312306236860621,\n \"mc2_stderr\": 0.014945471343395618\n\
\ }\n}\n```"
repo_url: https://huggingface.co/pankajmathur/model_007
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T02-03-09.335068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-03-09.335068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T02-03-09.335068.parquet'
- config_name: results
data_files:
- split: 2023_10_09T02_03_09.335068
path:
- results_2023-10-09T02-03-09.335068.parquet
- split: latest
path:
- results_2023-10-09T02-03-09.335068.parquet
---
# Dataset Card for Evaluation run of pankajmathur/model_007
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/model_007
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/model_007](https://huggingface.co/pankajmathur/model_007) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__model_007",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-09T02:03:09.335068](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__model_007/blob/main/results_2023-10-09T02-03-09.335068.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6901502879968988,
"acc_stderr": 0.031344534847114004,
"acc_norm": 0.6939037892141556,
"acc_norm_stderr": 0.03131458982120537,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.6312306236860621,
"mc2_stderr": 0.014945471343395618
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.01368814730972912,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6908982274447322,
"acc_stderr": 0.004611787665905346,
"acc_norm": 0.8765186217884884,
"acc_norm_stderr": 0.003283165867631372
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745657,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745657
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706467,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706467
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821678,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821678
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078894,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078894
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7436974789915967,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.7436974789915967,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.0134199390186812,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.0134199390186812
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658925,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658925
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445815,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445815
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305724,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5497206703910614,
"acc_stderr": 0.016639615236845817,
"acc_norm": 0.5497206703910614,
"acc_norm_stderr": 0.016639615236845817
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182651,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182651
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.02202136610022019,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.02202136610022019
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.545632333767927,
"acc_stderr": 0.012716941720734818,
"acc_norm": 0.545632333767927,
"acc_norm_stderr": 0.012716941720734818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377197,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377197
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.6312306236860621,
"mc2_stderr": 0.014945471343395618
}
}
```
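As a rough sketch (not part of the generated card), the per-task `acc` values in the JSON above can be averaged to reproduce a figure close to the reported `"all"` accuracy. The helper below is hypothetical and assumes the results have already been parsed into a Python dict shaped like the one shown:

```python
# Hypothetical helper: average per-task accuracies from a results dict
# shaped like the JSON above (only "harness|..." entries carry task scores,
# and truthfulqa reports mc1/mc2 instead of acc, so it is filtered out).
def mean_accuracy(results: dict) -> float:
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|") and "acc" in scores
    ]
    return sum(accs) / len(accs)

# Tiny sample using values copied from the JSON above.
sample = {
    "all": {"acc": 0.6901502879968988},  # aggregate entry, skipped by the filter
    "harness|arc:challenge|25": {"acc": 0.6749146757679181},
    "harness|hellaswag|10": {"acc": 0.6908982274447322},
    "harness|truthfulqa:mc|0": {"mc1": 0.449, "mc2": 0.631},  # no "acc" key
}

print(round(mean_accuracy(sample), 4))  # → 0.6829
```

The leaderboard's own aggregation may weight or select tasks differently; this is only an illustration of the dict layout.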
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.750741183757782,
-0.8903692364692688,
0.2580939531326294,
0.21753905713558197,
-0.1777983009815216,
-0.05044218525290489,
0.025306737050414085,
-0.2032364010810852,
0.5514603853225708,
-0.02963562123477459,
-0.47088196873664856,
-0.6983969807624817,
-0.4700784385204315,
0.24100665748119... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
AmanK1202/CNNOVEng_train | AmanK1202 | 2023-10-19T15:59:14Z | 71 | 0 | null | [
"region:us"
] | 2023-10-19T15:59:14Z | 2023-10-18T21:57:49.000Z | 2023-10-18T21:57:49 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Geonmo/deepfashion-multimodal-descriptions | Geonmo | 2023-10-30T07:58:32Z | 71 | 0 | null | [
"region:us"
] | 2023-10-30T07:58:32Z | 2023-10-30T07:58:29.000Z | 2023-10-30T07:58:29 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9586020
num_examples: 40770
download_size: 2270474
dataset_size: 9586020
---
# Dataset Card for "deepfashion-multimodal-descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7489804029464722,
-0.2317700833082199,
0.27794143557548523,
0.32211342453956604,
-0.26830652356147766,
0.16080494225025177,
-0.027611250057816505,
-0.3223325312137604,
0.7239288687705994,
0.4303274154663086,
-0.97697514295578,
-0.6994500160217285,
-0.5460973978042603,
-0.171075314283370... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Geonmo/deepfashion-multimodal-descriptions-split | Geonmo | 2023-10-30T08:06:07Z | 71 | 0 | null | [
"region:us"
] | 2023-10-30T08:06:07Z | 2023-10-30T08:06:04.000Z | 2023-10-30T08:06:04 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 939822
num_examples: 11730
download_size: 247226
dataset_size: 939822
---
# Dataset Card for "deepfashion-multimodal-descriptions-split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.7782443761825562,
-0.3854767680168152,
0.24689936637878418,
0.2903761565685272,
-0.4216691851615906,
0.3235238790512085,
-0.00924092996865511,
-0.32227009534835815,
0.7785183787345886,
0.446934312582016,
-0.9950119256973267,
-0.587325930595398,
-0.6197980046272278,
-0.13004331290721893,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
vietlegalqa/tvpl_summary_kha | vietlegalqa | 2023-11-03T23:29:26Z | 71 | 0 | null | [
"region:us"
] | 2023-11-03T23:29:26Z | 2023-11-03T23:29:23.000Z | 2023-11-03T23:29:23 | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: summary
sequence: string
- name: document
sequence: string
splits:
- name: train
num_bytes: 68977464
num_examples: 30000
download_size: 26754170
dataset_size: 68977464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tvpl_summary_kha"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.6719651818275452,
-0.2097935974597931,
-0.05660916864871979,
0.14369669556617737,
-0.416687548160553,
0.09790684282779694,
0.3533499836921692,
0.07666636258363724,
0.7866368889808655,
0.5849846005439758,
-0.5650182962417603,
-0.5297781229019165,
-0.653191864490509,
-0.3147738575935364,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
CJWeiss/ukabs_id_rename | CJWeiss | 2023-11-06T15:28:47Z | 71 | 0 | null | [
"region:us"
] | 2023-11-06T15:28:47Z | 2023-11-06T15:28:40.000Z | 2023-11-06T15:28:40 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 53147657
num_examples: 594
- name: test
num_bytes: 10152794
num_examples: 120
- name: valid
num_bytes: 8112656
num_examples: 79
download_size: 33052341
dataset_size: 71413107
---
# Dataset Card for "ukabs_id_rename"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.31146129965782166,
-0.023461777716875076,
-0.017043462023139,
-0.11146876215934753,
-0.46914976835250854,
0.06542575359344482,
0.28970444202423096,
-0.14711494743824005,
0.7788426876068115,
0.2927217483520508,
-0.766974925994873,
-0.5291491150856018,
-0.490692675113678,
-0.0552211366593... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
bgspaditya/byt-malicious-url-treatment | bgspaditya | 2023-11-12T12:04:00Z | 71 | 0 | null | [
"region:us"
] | 2023-11-12T12:04:00Z | 2023-11-12T11:45:48.000Z | 2023-11-12T11:45:48 | ---
dataset_info:
features:
- name: url
dtype: string
- name: type
dtype: string
- name: type_code
dtype: int64
splits:
- name: train
num_bytes: 42342364.57124805
num_examples: 512794
- name: val
num_bytes: 5292774.928436036
num_examples: 64099
- name: test
num_bytes: 5292857.500315916
num_examples: 64100
download_size: 31993322
dataset_size: 52927997.0
---
# Dataset Card for "byt-malicious-url-treatment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.288459450006485,
-0.5360337495803833,
0.16812632977962494,
0.21504370868206024,
-0.22983960807323456,
0.017428994178771973,
0.32927581667900085,
-0.18683075904846191,
0.7332121133804321,
0.8358482718467712,
-0.8614467978477478,
-0.682511568069458,
-0.7322399020195007,
-0.259199023246765... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
atmallen/qm_alice_easy_2_grader_last_1.0e | atmallen | 2023-11-16T18:22:43Z | 71 | 0 | null | [
"region:us"
] | 2023-11-16T18:22:43Z | 2023-11-16T03:25:43.000Z | 2023-11-16T03:25:43 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 8603063.0
num_examples: 117117
- name: validation
num_bytes: 831417.0
num_examples: 11279
- name: test
num_bytes: 825258.0
num_examples: 11186
download_size: 2481199
dataset_size: 10259738.0
---
# Dataset Card for "qm_alice_easy_2_grader_last_1.0e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | [
-0.27557307481765747,
-0.2630620300769806,
0.3227294385433197,
0.005357115995138884,
-0.05365271866321564,
-0.11887073516845703,
0.5823544263839722,
0.05961723253130913,
0.4204079210758209,
0.3310355842113495,
-0.5706251263618469,
-0.8686625361442566,
-0.6530414819717407,
-0.36893641948699... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ChengAoShen/emoji_dataset | ChengAoShen | 2023-11-21T11:59:50Z | 71 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-21T11:59:50Z | 2023-11-21T10:29:27.000Z | 2023-11-21T10:29:27 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 477727160.512
num_examples: 80672
download_size: 400526151
dataset_size: 477727160.512
license: mit
---
# Emoji_dataset
This dataset includes various emojis to enable training diffusion models and other generative models.
-0.2141267955303192,
-0.7966712117195129,
0.10935807228088379,
0.6296864151954651,
-0.2664421498775482,
0.1285037398338318,
0.05586053803563118,
0.5104143619537354,
0.4554348587989807,
0.4643450081348419,
-0.5892686247825623,
-0.5235161185264587,
-0.938620388507843,
-0.13134658336639404,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
persiannlp/parsinlu_entailment | persiannlp | 2022-10-22T15:13:00Z | 70 | 0 | null | [
"task_ids:natural-language-inference",
"annotations_creators:expert-generated",
"language_creators:expert-generated",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"source_datasets:extended|translated|mnli",
"language:fa",
"license:cc-by-nc-sa-4.0",
"arxiv:2012.06154",
"region:us"
] | 2022-10-22T15:13:00Z | 2022-03-02T23:29:22.000Z | 2022-03-02T23:29:22 | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- fa
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|translated|mnli
task_categories:
- textual-entailment
- natural-language-inference
task_ids:
- textual-entailment
- natural-language-inference
---
# Dataset Card for PersiNLU (Textual Entailment)
## Table of Contents
- [Dataset Card for PersiNLU (Textual Entailment)](#dataset-card-for-persi_nlu_entailment)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/persiannlp/parsinlu/)
- **Repository:** [Github](https://github.com/persiannlp/parsinlu/)
- **Paper:** [Arxiv](https://arxiv.org/abs/2012.06154)
- **Leaderboard:**
- **Point of Contact:** d.khashabi@gmail.com
### Dataset Summary
A Persian textual entailment task (deciding whether `sent1` entails `sent2`).
The questions are partially translated from the MNLI dataset and partially generated by expert annotators.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text dataset is in Persian (`fa`).
## Dataset Structure
### Data Instances
Here is an example from the dataset:
```json
{
"sent1": "سالها است که کنگره در تلاش است تا اثربخشی مدیریت اطلاعات و فناوری را در دولت فدرال افزایش دهد.",
"sent2": "کنگره بودجه ویژه ای برای مدیریت اطلاعات و فناوری در دولت فدرال دارد.",
"label": "n",
"category": "translation-train"
}
```
### Data Fields
- `sent1`: the first sentence.
- `sent2`: the second sentence.
- `category`: whether the sentence pair was translated from MNLI (`translation-*`) or written by native speakers (`natural-*`).
- `label`: `e` if `sent2` is entailed from `sent1`; `c` if `sent2` is contradictory to `sent1`; `n` if the two sentences are neutral.
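As a quick sketch, the single-letter labels can be decoded into readable names. The mapping below is inferred from the field description above, not taken from an official API:

```python
# Label map inferred from the field description above (not an official API).
LABEL_NAMES = {"e": "entailment", "c": "contradiction", "n": "neutral"}

def decode_label(example: dict) -> dict:
    """Return a copy of the example with a human-readable `label_name` field."""
    decoded = dict(example)
    decoded["label_name"] = LABEL_NAMES[decoded["label"]]
    return decoded

sample = {"sent1": "...", "sent2": "...", "label": "n", "category": "translation-train"}
print(decode_label(sample)["label_name"])  # neutral
```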
### Data Splits
The train/dev/test splits contain 756/271/1751 samples.
## Dataset Creation
### Curation Rationale
For details, check [the corresponding draft](https://arxiv.org/abs/2012.06154).
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
CC BY-NC-SA 4.0 License
### Citation Information
```bibtex
@article{huggingface:dataset,
title = {ParsiNLU: A Suite of Language Understanding Challenges for Persian},
authors = {Khashabi, Daniel and Cohan, Arman and Shakeri, Siamak and Hosseini, Pedram and Pezeshkpour, Pouya and Alikhani, Malihe and Aminnaseri, Moin and Bitaab, Marzieh and Brahman, Faeze and Ghazarian, Sarik and others},
year={2020}
journal = {arXiv e-prints},
eprint = {2012.06154},
}
```
### Contributions
Thanks to [@danyaljj](https://github.com/danyaljj) for adding this dataset.
| [
-0.3607502579689026,
-0.8479819297790527,
0.24732030928134918,
0.3328794240951538,
-0.190585657954216,
-0.12939828634262085,
-0.6238914728164673,
-0.18646429479122162,
0.4157286584377289,
0.4800759553909302,
-0.7126421928405762,
-0.819656491279602,
-0.5371139645576477,
0.45767223834991455,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
batubayk/HU-News | batubayk | 2023-03-04T22:40:26Z | 70 | 0 | null | [
"task_categories:summarization",
"task_categories:text-classification",
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:100K<n<1M",
"language:hu",
"region:us"
] | 2023-03-04T22:40:26Z | 2022-04-18T17:23:27.000Z | 2022-04-18T17:23:27 | ---
task_categories:
- summarization
- text-classification
- text-generation
- text2text-generation
language:
- hu
pretty_name: HU-News
size_categories:
- 100K<n<1M
---
# Citation
If you use the dataset, please cite the paper:
```bibtex
@article{10.1007/s10579-021-09568-y,
  year = {2022},
  title = {{Abstractive text summarization and new large-scale datasets for agglutinative languages Turkish and Hungarian}},
  author = {Baykara, Batuhan and Güngör, Tunga},
  journal = {Language Resources and Evaluation},
  issn = {1574-020X},
  doi = {10.1007/s10579-021-09568-y},
  pages = {1--35}
}
```
| [
-0.26775893568992615,
-0.5715453028678894,
0.03477097302675247,
0.2687060534954071,
-0.370378702878952,
-0.03354596719145775,
-0.4402487576007843,
-0.07134600728750229,
0.39891505241394043,
0.34030812978744507,
0.11779508739709854,
-0.5527652502059937,
-0.6513701677322388,
0.35464668273925... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lmqg/qg_squadshifts | lmqg | 2022-12-02T18:56:15Z | 70 | 1 | null | [
"task_categories:text-generation",
"task_ids:language-modeling",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"source_datasets:subjqa",
"language:en",
"license:cc-by-4.0",
"question-generation",
"arxiv:2210.03992",
"region:us"
] | 2022-12-02T18:56:15Z | 2022-06-02T18:56:40.000Z | 2022-06-02T18:56:40 | ---
license: cc-by-4.0
pretty_name: SQuADShifts for question generation
language: en
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets: squadshifts
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qg_squadshifts"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a subset of [QG-Bench](https://github.com/asahi417/lm-question-generation/blob/master/QG_BENCH.md#datasets), a unified question generation benchmark proposed in
["Generative Language Models for Paragraph-Level Question Generation: A Unified Benchmark and Evaluation, EMNLP 2022 main conference"](https://arxiv.org/abs/2210.03992).
Modified version of [SQuADShifts](https://modestyachts.github.io/squadshifts-website/index.html) for question generation (QG) task.
### Supported Tasks and Leaderboards
* `question-generation`: The dataset can be used to train a model for question generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more in detail).
### Languages
English (en)
## Dataset Structure
An example of 'train' looks as follows.
```
{
  "question": "has there ever been a legal challange?",
  "paragraph": "The status of the Armenian Apostolic Church within the Republic of Armenia is defined in the country's constitution. Article 8.1 of the Constitution of Armenia states: \"The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia.\" Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase \"national church\".",
  "answer": "Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase \"national church\"",
  "sentence": "Article 8.1 of the Constitution of Armenia states: \"The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia.\" Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase \"national church\"",
  "paragraph_sentence": "The status of the Armenian Apostolic Church within the Republic of Armenia is defined in the country's constitution. <hl> Article 8.1 of the Constitution of Armenia states: \"The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia.\" Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase \"national church\". <hl>",
  "paragraph_answer": "The status of the Armenian Apostolic Church within the Republic of Armenia is defined in the country's constitution. Article 8.1 of the Constitution of Armenia states: \"The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia.\" <hl> Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase \"national church\". <hl>",
  "sentence_answer": "Article 8.1 of the Constitution of Armenia states: \"The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia.\" <hl> Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase \"national church\". <hl>"
}
```
The data fields are the same among all splits.
- `question`: a `string` feature.
- `paragraph`: a `string` feature.
- `answer`: a `string` feature.
- `sentence`: a `string` feature.
- `paragraph_answer`: a `string` feature, the same as the paragraph but with the answer highlighted by a special token `<hl>`.
- `paragraph_sentence`: a `string` feature, the same as the paragraph but with the sentence containing the answer highlighted by a special token `<hl>`.
- `sentence_answer`: a `string` feature, the same as the sentence but with the answer highlighted by a special token `<hl>`.
Each of the `paragraph_answer`, `paragraph_sentence`, and `sentence_answer` features is assumed to be used to train a question generation model,
but with different information. The `paragraph_answer` and `sentence_answer` features are for answer-aware question generation, and the
`paragraph_sentence` feature is for sentence-aware question generation.
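As a sketch of how such a highlighted field can be derived from a raw (paragraph, answer) pair, assuming the answer occurs verbatim in the paragraph. This mirrors the format of the fields above but is not the project's own preprocessing code:

```python
HL = "<hl>"

def highlight_answer(paragraph: str, answer: str) -> str:
    """Wrap the first occurrence of the answer in <hl> tokens (paragraph_answer style)."""
    start = paragraph.index(answer)  # raises ValueError if the answer is absent
    end = start + len(answer)
    return f"{paragraph[:start]}{HL} {answer} {HL}{paragraph[end:]}".strip()

print(highlight_answer("Paris is the capital of France.", "Paris"))
# <hl> Paris <hl> is the capital of France.
```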
### Data Splits
| name          | train | valid | test  |
|---------------|------:|------:|------:|
| default (all) |  9209 |  6283 | 18844 |
| amazon        |  3295 |  1648 |  4942 |
| new_wiki      |  2646 |  1323 |  3969 |
| nyt           |  3355 |  1678 |  5032 |
| reddit        |  3268 |  1634 |  4901 |
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` | [
-0.45688894391059875,
-0.8005110025405884,
0.1920403093099594,
0.20510590076446533,
-0.4775897264480591,
-0.1768607497215271,
-0.3886236548423767,
-0.22746378183364868,
0.03393946588039398,
0.5313927531242371,
-0.6546651721000671,
-0.4513269364833832,
-0.16740179061889648,
0.44966554641723... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
imvladikon/leipzig_corpora_collection | imvladikon | 2023-11-12T08:49:08Z | 70 | 2 | null | [
"task_categories:text-generation",
"task_categories:fill-mask",
"multilinguality:multilingual",
"size_categories:n<1K",
"size_categories:1K<n<10K",
"size_categories:10K<n<100K",
"size_categories:100K<n<1M",
"size_categories:1M<n<10M",
"source_datasets:original",
"language:ar",
"language:en",
"... | 2023-11-12T08:49:08Z | 2022-06-19T16:03:28.000Z | 2022-06-19T16:03:28 | ---
language:
- ar
- en
- he
- de
- it
- fr
- pl
- pt
- ru
- uk
task_categories:
- text-generation
- fill-mask
source_datasets:
- original
multilinguality:
- multilingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
config_names:
- links
---
## Leipzig Corpora Collection
The [Leipzig Corpora Collection](https://wortschatz.uni-leipzig.de/en/download) presents corpora in different languages using the same format and comparable sources. All data are available as plain text files and can be imported into a MySQL database by using the provided import script. They are intended both for scientific use by corpus linguists as well as for applications such as knowledge extraction programs.
The corpora are identical in format and similar in size and content. They contain randomly selected sentences in the language of the corpus and are available in sizes from 10,000 sentences up to 1 million sentences. The sources are either newspaper texts or texts randomly collected from the web. The texts are split into sentences. Non-sentences and foreign language material were removed. Because word co-occurrence information is useful for many applications, these data are precomputed and included as well. For each word, the most significant words appearing as immediate left or right neighbor or appearing anywhere within the same sentence are given. More information about the format and content of these files can be found [here](https://wortschatz.uni-leipzig.de/en/download).
The corpora are automatically collected from carefully selected public sources without considering in detail the content of the contained text. No responsibility is taken for the content of the data. In particular, the views and opinions expressed in specific parts of the data remain exclusively with the authors.
## Dataset Usage
### Links
The "links" subset contains download URLs with the corresponding language and id (based on `https://corpora.uni-leipzig.de/`):
```python
from datasets import load_dataset
ds = load_dataset("imvladikon/leipzig_corpora_collection", "links")
for row in ds["train"]:
print(row)
```
```
{'id': '0', 'data_id': '0', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/ara_news_2005-2009_10K.tar.gz', 'language': 'Arabic', 'language_short': 'ara', 'year': '2005', 'size': '10K'}
{'id': '1', 'data_id': '1', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/ara_news_2005-2009_30K.tar.gz', 'language': 'Arabic', 'language_short': 'ara', 'year': '2005', 'size': '30K'}
{'id': '2', 'data_id': '2', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/ara_news_2005-2009_100K.tar.gz', 'language': 'Arabic', 'language_short': 'ara', 'year': '2005', 'size': '100K'}
....
```
where it is possible to choose a specific `data_id` to load a specific dataset (`data_id` is the name of the subset).
Links can be filtered according to metadata attributes:
```python
links = load_dataset("imvladikon/leipzig_corpora_collection", "links", split="train")
english_2019 = links.filter(lambda x: x["language"] == "English" and x["year"] == "2019")
for sample in english_2019:
print(sample)
```
```
{'id': '277', 'data_id': 'eng_news_2019_10K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng_news_2019_10K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '10K'}
{'id': '278', 'data_id': 'eng_news_2019_30K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng_news_2019_30K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '30K'}
{'id': '279', 'data_id': 'eng_news_2019_100K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng_news_2019_100K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '100K'}
{'id': '280', 'data_id': 'eng_news_2019_300K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng_news_2019_300K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '300K'}
{'id': '281', 'data_id': 'eng_news_2019_1M', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng_news_2019_1M.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '1M'}
{'id': '541', 'data_id': 'eng-za_web_2019_10K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng-za_web_2019_10K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '10K'}
{'id': '542', 'data_id': 'eng-za_web_2019_30K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng-za_web_2019_30K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '30K'}
{'id': '543', 'data_id': 'eng-za_web_2019_100K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng-za_web_2019_100K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '100K'}
{'id': '544', 'data_id': 'eng-za_web_2019_300K', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng-za_web_2019_300K.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '300K'}
{'id': '545', 'data_id': 'eng-za_web_2019_1M', 'url': 'https://downloads.wortschatz-leipzig.de/corpora/eng-za_web_2019_1M.tar.gz', 'language': 'English', 'language_short': 'eng', 'year': '2019', 'size': '1M'}
```
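For rows like the ones above whose `data_id` is the corpus name (note that the very first rows carry numeric ids instead), the download URL is simply that name under the downloads host. A minimal sketch:

```python
def leipzig_url(data_id: str) -> str:
    """Rebuild the download URL from a corpus-name data_id, per the rows above."""
    return f"https://downloads.wortschatz-leipzig.de/corpora/{data_id}.tar.gz"

print(leipzig_url("eng_news_2019_10K"))
# https://downloads.wortschatz-leipzig.de/corpora/eng_news_2019_10K.tar.gz
```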
### Corpus
After selecting a `data_id`, say `heb_wikipedia_2021_1M`, we can load it:
```python
dataset_he = load_dataset("imvladikon/leipzig_corpora_collection", "heb_wikipedia_2021_1M", split="train")
for row in dataset_he:
print(row)
```
another example:
```python
dataset_en = load_dataset("imvladikon/leipzig_corpora_collection", "eng-simple_wikipedia_2021_300K", split="train")
print(dataset_en[76576])
```
sample:
```json
{'id': '79214', 'sentence': 'He was a member of the assembly from 1972 to 1977.'}
```
## Citation
If you use one of these corpora in your work, please cite [this work](http://www.lrec-conf.org/proceedings/lrec2012/pdf/327_Paper.pdf):
```
@inproceedings{goldhahn-etal-2012-building,
title = "Building Large Monolingual Dictionaries at the {L}eipzig Corpora Collection: From 100 to 200 Languages",
author = "Goldhahn, Dirk and
Eckart, Thomas and
Quasthoff, Uwe",
editor = "Calzolari, Nicoletta and
Choukri, Khalid and
Declerck, Thierry and
Do{\u{g}}an, Mehmet U{\u{g}}ur and
Maegaard, Bente and
Mariani, Joseph and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Eighth International Conference on Language Resources and Evaluation ({LREC}'12)",
month = may,
year = "2012",
address = "Istanbul, Turkey",
publisher = "European Language Resources Association (ELRA)",
url = "http://www.lrec-conf.org/proceedings/lrec2012/pdf/327_Paper.pdf",
pages = "759--765",
abstract = "The Leipzig Corpora Collection offers free online access to 136 monolingual dictionaries enriched with statistical information. In this paper we describe current advances of the project in collecting and processing text data automatically for a large number of languages. Our main interest lies in languages of low density, where only few text data exists online. The aim of this approach is to create monolingual dictionaries and statistical information for a high number of new languages and to expand the existing dictionaries, opening up new possibilities for linguistic typology and other research. Focus of this paper will be set on the infrastructure for the automatic acquisition of large amounts of monolingual text in many languages from various sources. Preliminary results of the collection of text data will be presented. The mainly language-independent framework for preprocessing, cleaning and creating the corpora and computing the necessary statistics will also be depicted.",
}
``` | [
-0.8370133638381958,
-0.514775276184082,
0.17559029161930084,
-0.026826102286577225,
-0.37133321166038513,
0.221150204539299,
-0.5130163431167603,
-0.4616297781467438,
0.5025070905685425,
0.08276890218257904,
-0.31347888708114624,
-0.8524379730224609,
-0.3652488589286804,
0.462726682424545... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ghmfx/natural-questions-short | ghmfx | 2022-12-17T21:29:07Z | 70 | 0 | null | [
"license:wtfpl",
"region:us"
] | 2022-12-17T21:29:07Z | 2022-12-17T21:28:43.000Z | 2022-12-17T21:28:43 | ---
license: wtfpl
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
cbasu/Med-EASi | cbasu | 2023-03-08T18:24:31Z | 70 | 1 | null | [
"arxiv:2302.09155",
"region:us"
] | 2023-03-08T18:24:31Z | 2023-03-08T00:27:53.000Z | 2023-03-08T00:27:53 | ---
{}
---
# Dataset Card for Med-EASi
## Dataset Description
- **Repository:** https://github.com/Chandrayee/CTRL-SIMP
- **Paper:** https://arxiv.org/pdf/2302.09155.pdf
- **Point of Contact:** Chandrayee Basu
### Dataset Summary
Med-EASi (Medical dataset for Elaborative and Abstractive Simplification) is a uniquely crowdsourced and finely annotated dataset for supervised simplification of short medical
texts. It contains 1,979 expert-simple text pairs in the medical domain, spanning a total of 4,478 UMLS concepts across all text pairs. The dataset is annotated with four textual transformations:
replacement, elaboration, insertion, and deletion.
### Supported Tasks
The dataset can be used for direct generation of simplified medical text or generation of simplified text along with controllability over individual transformations. Please refer to the paper for more information.
### Languages
English
## Dataset Structure
- **train.csv: 1397 text pairs (5.19 MB)**
- **validation.csv: 197 text pairs (1.5 MB)**
- **test.csv: 300 text pairs (1.19 MB)**
We also provide several metrics per data point, including Levenshtein similarity, SentenceBERT embedding cosine similarity, compression ratio, Flesch-Kincaid readability grade, and
automated readability index for each of the expert and simple texts, as well as the UMLS concepts in each of them.
### Data Instances
```
Expert: Some patients have weight loss, rarely enough to become underweight. Anemia, glossitis, angular stomatitis, and aphthous ulcers are usually seen in these patients.
Simple: Some people are undernourished, have mild weight loss and anemia, or have mouth sores and an inflamed tongue.
Annotated: Some <elab>patients<by>people are undernourished,</elab> have <elab>weight loss<by>mild weight loss</elab><del>, rarely enough to become underweight.</del> <rep>Anemia, glossitis, angular stomatitis, and aphthous ulcers<by>and anemia, or have mouth sores and an inflamed tongue</rep><del>usually seen in these patients</del>.
```
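A minimal sketch of working with the annotation markup: keeping the `<by>` side of `rep`/`elab` spans and dropping `del` spans approximately recovers the simple side. This is illustrative only, based on the tags visible in the example above (the insertion tag is not shown there); it is not the authors' tooling:

```python
import re

def to_simple(annotated: str) -> str:
    """Approximately recover the simple text from an annotated string."""
    # For replacements/elaborations, keep the layman side after <by>.
    out = re.sub(r"<(rep|elab)>.*?<by>(.*?)</\1>", r"\2", annotated, flags=re.S)
    # Drop deleted spans entirely, then normalize whitespace.
    out = re.sub(r"<del>.*?</del>", "", out, flags=re.S)
    return re.sub(r"\s+", " ", out).strip()

print(to_simple("Some <elab>patients<by>people</elab> have pain<del>, rarely</del>."))
# Some people have pain.
```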
### Data Fields
```
Expert
Simple
Annotation
sim (Levenshtein similarity)
sentence_sim (SentenceBERT embedding cosine similarity)
compression
expert_fk_grade
expert_ari
layman_fk_grade
layman_ari
umls_expert
umls_layman
expert_terms
layman_terms
idx (original data index before shuffling, redundant)
```
### Data Splits
75 % train, 10 % validation and 15 % test
## Dataset Creation
This dataset was created by annotating 1500 SIMPWIKI data points (Van den Bercken, Sips, and Lofi 2019) and all MSD (Cao et al. 2020) data points. We used expert-layman-AI collaboration for annotation.
### Personal and Sensitive Information
There is no personal or sensitive information in this dataset.
## Considerations for Using the Data
### Discussion of Biases
The dataset contains biomedical and clinical short texts.
### Other Known Limitations
The expert and simple texts in the original datasets were extracted and aligned using automated methods that have their own limitations.
### Citation Information
```
@article{basu2023med,
title={Med-EASi: Finely Annotated Dataset and Models for Controllable Simplification of Medical Texts},
author={Basu, Chandrayee and Vasu, Rosni and Yasunaga, Michihiro and Yang, Qian},
journal={arXiv preprint arXiv:2302.09155},
year={2023}
}
``` | [
-0.20194734632968903,
-0.6156529784202576,
0.5026124119758606,
0.012738651596009731,
-0.21071600914001465,
-0.355458527803421,
-0.21073085069656372,
-0.3951442241668701,
0.6734070181846619,
0.4095623195171356,
-0.36497899889945984,
-0.7416316270828247,
-0.4548059403896332,
0.47227799892425... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
andreped/IBDColEpi | andreped | 2023-11-08T22:02:54Z | 70 | 0 | null | [
"task_categories:image-segmentation",
"size_categories:1B<n<10B",
"language:en",
"license:mit",
"medical",
"region:us"
] | 2023-11-08T22:02:54Z | 2023-05-29T15:32:48.000Z | 2023-05-29T15:32:48 | ---
license: mit
task_categories:
- image-segmentation
language:
- en
tags:
- medical
pretty_name: IBDColEpi
size_categories:
- 1B<n<10B
---
# IBDColEpi: 140 HE and 111 CD3-stained colon biopsies of active and inactive inflammatory bowel disease with epithelium annotated
You can access and work with the data in Python through the `datasets` library. See this Jupyter notebook for how to get started:
https://github.com/andreped/NoCodeSeg/blob/main/notebooks/IBDColEpi-load-dataset-example.ipynb
Note that it is also possible to download the data through the Hugging Face web interface, as well as through [this Google Drive](https://drive.google.com/drive/u/0/folders/1eUVs1DA1UYayUYjr8_aY3O5xDgV1uLvH)
and [this DataverseNO](https://dataverse.no/dataset.xhtml?persistentId=doi:10.18710/TLA01U) link.
--------------------
GENERAL INFORMATION
--------------------
1. Title of Dataset: 140 HE and 111 CD3-stained colon biopsies of active and inactive inflammatory bowel disease with epithelium annotated: the IBDColEpi dataset
2. DOI: https://doi.org/10.18710/TLA01U
3. Contact Information
Name: André Pedersen
Institution: NTNU Norwegian University of Science and Technology
Email: andre.pedersen@ntnu.no
ORCID: https://orcid.org/0000-0002-3637-953X
4. Contributors: See metadata field Contributor.
5. Kind of data: See metadata field Kind of Data.
6. Date of data collection/generation: See metadata field Date of Collection.
7. Geographic location: See metadata section Geographic Coverage.
8. Funding sources: See metadata section Grant Information.
9. Description of dataset:
General description and ethics approvals: The dataset contains 140 HE and 111 CD3 stained, formalin fixed paraffin embedded (FFPE) biopsies of colonic mucosa. The biopsies were extracted from the NTNU/St. Olavs hospital, Trondheim University Hospital (Norway) biobank of patients with confirmed inflammatory bowel disease or healthy controls with gastrointestinal symptoms but no macroscopic or microscopic disease. Inclusion and colonoscopies were performed at the Department of Gastroenterology and Hepatology at St. Olavs hospital, Trondheim University Hospital from 2007 to 2018. All patients gave written informed consent and ethical approvals were obtained from the Central Norway Regional Committee for Medical and Health Research Ethics (reference number 2013/212/REKMidt). Consent to publish the anonymized whole slide image (WSI) dataset was given by REKMidt in 2021. Each database ID number used in this study was changed to new anonymized IDs only containing the information “active” or “inactive” disease and whether the WSI has haematoxylin-eosin (HE) staining or CD3 immunostaining. The biopsies included in the biobank are sampled such that one biopsy from an unaffected/inactive area and one from an affected/active area were included from each patient and given a separate ID number. Hence, two biopsies with different ID numbers can be from the same patient. "Active" is defined as the presence of intraepithelial granulocytes in one or more locations in the biopsies. Still, the changes may be focal, hence the majority of the epithelium may still lack intraepithelial granulocytes or other signs of active disease (crypt abscesses, granulation tissue, etc.).
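The file overview below states that each WSI filename encodes its anonymized ID and activity label as "ID-X_Y.ndpi". A minimal sketch of parsing that scheme (the concrete filename here is hypothetical):

```python
from pathlib import Path

def parse_wsi_name(filename: str) -> dict:
    """Split an 'ID-X_Y.ndpi' filename into its anonymized id and active/inactive label."""
    stem = Path(filename).stem             # e.g. 'ID-12_active'
    slide_id, label = stem.rsplit("_", 1)  # label Y is 'active' or 'inactive'
    return {"id": slide_id, "label": label}

print(parse_wsi_name("ID-12_active.ndpi"))  # {'id': 'ID-12', 'label': 'active'}
```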
---------------------------
SHARING/ACCESS INFORMATION
---------------------------
(See metadata record for dataset.)
1. Licenses/Restrictions: See Terms section.
2. Links to publications that cite or use the data: See metadata field Related Publication.
3. Links/relationships to related data sets: See metadata field Related Datasets.
4. Data sources: See metadata field Data Sources.
5. Recommended citation: See citation generated by repository.
---------------------
DATA & FILE OVERVIEW
---------------------
1. File List:
00_README.txt
trained-models.zip
patch-dataset-CD3.zip
patch-dataset-HE.zip
qupath-project-annotations.zip
TIFF-annotations.zip
WSI_part_01.zip
WSI_part_02.zip
WSI_part_03.zip
WSI_part_04.zip
WSI_part_05.zip
WSI_part_06.zip
WSI_part_07.zip
WSI_part_08.zip
WSI_part_09.zip
WSI_part_10.zip
2. Relationship between files, if important:
- trained-models.zip: the best performing trained models (for both HE and CD3) on the images from WSI_part_*.zip using the manual delineations from TIFF-annotations.zip.
- WSI_part_*.zip: the colon biopsies described in the metadata (1-10). For each ID, the active/inactive label Y is stored in the filename, with the format: "ID-X_Y.ndpi".
- TIFF-annotations.zip: the corresponding annotations to the WSIs. The filenames of the annotations are in the same structure as the corresponding WSIs, with the format: "ID-X_Y.tiff".
- patch-dataset-*.zip: the corresponding patch images and labels, split into train/validation/test sets, relevant for the evaluation of the design in the publication. Both for HE and CD3
- qupath-project-annotations.zip: the qupath project file, also containing the annotations of all WSIs, but can be directly read in QuPath (after renaming of WSI paths). | [
-0.30295974016189575,
-0.2987971901893616,
0.4724312722682953,
-0.01915677636861801,
-0.14176267385482788,
0.013313785195350647,
0.07651891559362411,
-0.48497962951660156,
0.4946693181991577,
0.3930535614490509,
-0.13257263600826263,
-0.5378428101539612,
-0.4817637503147125,
0.566926598548... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Someman/hindi-summarization | Someman | 2023-05-30T12:55:13Z | 70 | 0 | null | [
"task_categories:summarization",
"size_categories:10K<n<100K",
"language:hi",
"license:mit",
"region:us"
] | 2023-05-30T12:55:13Z | 2023-05-30T12:39:11.000Z | 2023-05-30T12:39:11 | ---
license: mit
task_categories:
- summarization
language: hi
original_source: >-
https://www.kaggle.com/datasets/disisbig/hindi-text-short-and-large-summarization-corpus
dataset_info:
features:
- name: headline
dtype: string
- name: summary
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 410722079.5542422
num_examples: 55226
- name: test
num_bytes: 102684238.44575782
num_examples: 13807
- name: valid
num_bytes: 128376473
num_examples: 17265
download_size: 150571314
dataset_size: 641782791
pretty_name: hindi summarization
size_categories:
- 10K<n<100K
---
# Dataset Card for Hindi Summarization
## Dataset Description
- Homepage: https://www.kaggle.com/datasets/disisbig/hindi-text-short-and-large-summarization-corpus?select=test.csv
### Dataset Summary
Hindi Text Short and Large Summarization Corpus is a collection of ~180k articles with their headlines and summaries collected from Hindi news websites.
It is a first-of-its-kind dataset in Hindi that can be used to benchmark models for text summarization in Hindi. It does not contain the articles in the Hindi Text Short Summarization Corpus, which is being released in parallel with this dataset.
The dataset retains the original punctuation, numbers, etc. in the articles.
### Languages
The language is Hindi.
### Licensing Information
MIT
### Citation Information
https://www.kaggle.com/datasets/disisbig/hindi-text-short-and-large-summarization-corpus?select=test.csv
### Contributions
| [
-0.2616473138332367,
-0.6810623407363892,
0.033742476254701614,
0.48098066449165344,
-0.589948296546936,
0.32149145007133484,
-0.3863726258277893,
-0.04925035312771797,
0.3525868356227875,
0.32421907782554626,
-0.4155081510543823,
-0.6499976515769958,
-0.7423269748687744,
0.487880617380142... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
allenai/dolma | allenai | 2023-11-27T05:26:01Z | 70 | 346 | null | [
"task_categories:text-generation",
"size_categories:n>1T",
"language:en",
"license:other",
"language-modeling",
"casual-lm",
"llm",
"region:us"
] | 2023-11-27T05:26:01Z | 2023-06-30T20:14:39.000Z | 2023-06-30T20:14:39 | ---
license: other
license_name: impact-license-medium-risk
license_link: https://allenai.org/licenses/impact-mr
viewer: false
task_categories:
- text-generation
language:
- en
tags:
- language-modeling
- casual-lm
- llm
pretty_name: Dolma
size_categories:
- n>1T
extra_gated_prompt: "Access to this dataset is automatically granted upon accepting the [**AI2 ImpACT License - Medium Risk Artifacts (“MR Agreement”)**](https://allenai.org/licenses/impact-mr) and completing all fields below."
extra_gated_fields:
Your full name: text
Organization or entity you are affiliated with: text
State or country you are located in: text
Contact email: text
Please describe your intended use of the medium risk artifact(s): text
I AGREE to the terms and conditions of the MR Agreement above: checkbox
I AGREE to AI2’s use of my information for legal notices and administrative matters: checkbox
I CERTIFY that the information I have provided is true and accurate: checkbox
---
# Dolma
<img alt="Dolma's official logo. It's dolma written in yellow, round lowercase letters over a blue background." src="https://raw.githubusercontent.com/allenai/dolma/main/docs/assets/AI2_Blog_1400x685_2x.webp" width="100%">
Dolma is a dataset of 3 trillion tokens from a diverse mix of web content, academic publications, code, books, and encyclopedic materials. It is openly released under AI2’s ImpACT license as a medium risk artifact.
More information:
- Read Dolma **announcement blogpost** [on Medium](https://soldni.medium.com/dolma-3-trillion-tokens-open-llm-corpus-9a0ff4b8da64);
- Learn more about Dolma on its [**Data Sheet**](https://drive.google.com/file/d/12gOf5I5RytsD159nSP7iim_5zN31FCXq/view?usp=drive_link);
- Review Dolma's [**ImpACT license** for medium risk artifacts](https://allenai.org/licenses/impact-mr);
- Explore the [**open source tools**](https://github.com/allenai/dolma) we created to curate Dolma.
- Want to request removal of personal data? Use [this form](https://forms.gle/q4BNUUxUxKwKkfdT6) to notify us of documents containing PII about a specific user.
To learn more about the toolkit used to create Dolma, including how to replicate this dataset, head over our [GitHub project page](https://github.com/allenai/dolma/tree/main/docs)!
## Summary Statistics
|**Source**|**Type**|**Gzip files (GB)**|**Documents (millions)**|**[GPT-NeoX](https://huggingface.co/EleutherAI/gpt-neox-20b) Tokens (billions)**|
|:---|:---:|:---:|:---:|:----:|
|[CommonCrawl](https://commoncrawl.org/)|web|4,197|4,600|2,415|
|[C4](https://huggingface.co/datasets/allenai/c4)|web|302|364|175|
|[peS2o](https://huggingface.co/datasets/allenai/peS2o)|academic|150|38.8|57|
|[The Stack](https://huggingface.co/datasets/bigcode/the-stack)|code|319|236|430|
|[Project Gutenberg](https://www.gutenberg.org/)|books|6.6|0.052|4.8|
|[Wikipedia](https://dumps.wikimedia.org/)|encyclopedic|5.8|6.1|3.6|
||**Total** |**4980.4**|**5,245**|**3,084**|
## Download
The fastest way to download Dolma is to download the individual files directly across multiple threads.
This can be done with `wget` or with the [aria2](https://github.com/aria2/aria2) package, available for Linux, macOS, and Windows (`sudo apt-get install aria2` on Ubuntu).
For downloading individual files, simply use `wget` as follows:
`wget --header 'Authorization: Bearer YOUR_HF_HUB_ACCESS_TOKEN' https://huggingface.co/datasets/allenai/dolma/resolve/main/data/peS2o/s2_v3-0000.json.gz`
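The shard-file naming in these URLs is uniform, so the links can also be generated programmatically. A minimal sketch, assuming the `s2_v3-NNNN.json.gz` pattern shown above holds for all 42 peS2o shards:

```python
# Build direct-download URLs for the peS2o shards without any network access.
# The zero-padded shard pattern is taken from the example URL above.
BASE = "https://huggingface.co/datasets/allenai/dolma/resolve/main/data/peS2o"

def pes2o_url(shard: int) -> str:
    """Return the resolve URL for one peS2o shard."""
    return f"{BASE}/s2_v3-{shard:04d}.json.gz"

urls = [pes2o_url(i) for i in range(42)]
print(urls[0])  # → .../peS2o/s2_v3-0000.json.gz
```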
For downloading many files across multiple threads, first prepare a `.txt` file with the URLs you want, for example with the script below:
```python
OUT_DIRECTORY = "/scratch/dolma/data"
# URLs for cc_en_head
cc_en_head_base_url = "https://huggingface.co/datasets/allenai/dolma/resolve/main/data/common-crawl/cc_en_head/cc_en_head-"
cc_en_head_url_list = [f"{cc_en_head_base_url}{str(i).zfill(4)}.json.gz\n dir={OUT_DIRECTORY}/cc_en_head\n out=cc_en_head-{str(i).zfill(4)}.json.gz" for i in range(612)]
# URLs for cc_en_middle
cc_en_middle_base_url = "https://huggingface.co/datasets/allenai/dolma/resolve/main/data/common-crawl/cc_en_middle/cc_en_middle-"
cc_en_middle_url_list = [f"{cc_en_middle_base_url}{str(i).zfill(4)}.json.gz\n dir={OUT_DIRECTORY}/cc_en_middle\n out=cc_en_middle-{str(i).zfill(4)}.json.gz" for i in range(777)]
# URLs for cc_en_tail
cc_en_tail_base_url = "https://huggingface.co/datasets/allenai/dolma/resolve/main/data/common-crawl/cc_en_tail/cc_en_tail-"
cc_en_tail_url_list = [f"{cc_en_tail_base_url}{str(i).zfill(4)}.json.gz\n dir={OUT_DIRECTORY}/cc_en_tail\n out=cc_en_tail-{str(i).zfill(4)}.json.gz" for i in range(1493)]
# URLs for s2_v3
s2_v3_base_url = "https://huggingface.co/datasets/allenai/dolma/resolve/main/data/peS2o/s2_v3-"
s2_v3_url_list = [f"{s2_v3_base_url}{str(i).zfill(4)}.json.gz\n dir={OUT_DIRECTORY}/peS2o\n out=s2_v3-{str(i).zfill(4)}.json.gz" for i in range(42)]
# URLs for The Stack
LANG_TO_FILES = {'lasso': 1, 'nsis': 1, 'literate-agda': 1, 'metal': 1, 'xojo': 1, 'max': 8, 'jupyter-notebook': 101, 'asp': 7, 'elixir': 14, 'html+erb': 19, 'julia': 22, 'dart': 63, 'ragel-in-ruby-host': 1, 'api-blueprint': 1, 'gams': 1, 'tex': 71, 'xml': 101, 'smalltalk': 17, 'cmake': 11, 'piglatin': 1, "cap'n-proto": 1, 'common-lisp': 21, 'stylus': 3, 'typescript': 101, 'jflex': 1, 'factor': 1, 'arc': 1, 'parrot-internal-representation': 1, 'aspectj': 1, 'go': 101, 'urweb': 1, 'dns-zone': 1, 'purebasic': 1, 'toml': 15, 'erlang': 11, 'hy': 1, 'component-pascal': 2, 'oz': 1, 'opa': 1, 'handlebars': 10, 'gas': 15, 'less': 17, 'gnuplot': 15, 'harbour': 1, 'vhdl': 16, 'octave': 1, 'powershell': 21, 'clips': 1, 'fish': 1, 'prolog': 1, 'sparql': 1, 'objective-j': 1, 'scaml': 1, 'twig': 20, 'gettext-catalog': 101, 'purescript': 2, 'vala': 1, 'gosu': 1, 'apacheconf': 1, 'xc': 1, 'lean': 3, 'mako': 1, 'r': 4, 'unrealscript': 1, 'solidity': 21, 'pike': 1, 'cartocss': 1, 'maple': 1, 'graphql': 3, 'unity3d-asset': 101, 'swift': 101, 'dockerfile': 13, 'digital-command-language': 1, 'scala': 83, 'sqf': 2, 'logtalk': 1, 'coq': 1, 'shellsession': 1, 'befunge': 1, 'nu': 1, 'ecere-projects': 1, 'zimpl': 1, 'shen': 1, 'golo': 1, 'web-ontology-language': 12, 'sas': 2, 'uno': 1, 'livescript': 1, 'literate-haskell': 1, 'clojure': 8, 'perl6': 1, 'zig': 3, 'liquid': 2, 'ec': 1, 'blitzbasic': 1, 'sql': 101, 'http': 2, 'xproc': 1, 'kit': 1, 'textile': 1, 'netlinx': 1, 'propeller-spin': 1, 'cython': 5, 'realbasic': 1, 'dogescript': 1, 'llvm': 9, 'pawn': 1, 'groff': 40, 'html+django': 3, 'csound': 1, 'd': 1, 'agda': 2, 'css': 101, 'yacc': 7, 'robotframework': 1, 'kotlin': 101, 'grace': 1, 'abap': 2, 'blitzmax': 1, 'webassembly': 3, 'ampl': 1, 'postscript': 16, 'nit': 1, 'gentoo-eclass': 1, 'xpages': 1, 'linker-script': 2, 'yang': 3, 'jade': 4, 'standard-ml': 6, 'javascript': 101, 'moonscript': 1, 'mtml': 1, 'saltstack': 1, 'freemarker': 5, 'ston': 1, 'html+eex': 1, 'xs': 1, 'c++': 101, 
'matlab': 1, 'm4': 2, 'xbase': 1, 'perl': 37, 'emacs-lisp': 7, 'bison': 1, 'slim': 2, 'grammatical-framework': 1, 'rdoc': 1, 'nix': 10, 'clean': 1, 'module-management-system': 1, 'nimrod': 6, 'raml': 1, 'forth': 1, 'squirrel': 1, 'alloy': 1, 'opencl': 3, 'c': 101, 'sass': 4, 'eiffel': 2, 'papyrus': 1, 'html': 109, 'java': 101, 'hcl': 14, 'isabelle': 2, 'markdown': 101, 'gentoo-ebuild': 2, 'objdump': 1, 'emberscript': 1, 'text': 101, 'bro': 1, 'opal': 1, 'haskell': 35, 'mupad': 1, 'desktop': 1, 'modelica': 2, 'coldfusion-cfc': 2, 'fantom': 1, 'glsl': 10, 'ocaml': 16, 'nesc': 2, 'scheme': 7, 'crystal': 5, 'tcsh': 1, 'c2hs-haskell': 1, 'idris': 1, 'logos': 4, 'coffeescript': 13, 'g-code': 10, 'sage': 1, 'haml': 4, 'tcl': 7, 'smt': 5, 'ox': 1, 'chuck': 1, 'xquery': 1, 'batchfile': 7, 'pod': 2, 'xtend': 1, 'restructuredtext': 61, 'rmarkdown': 1, 'turtle': 33, 'jsx': 45, 'protocol-buffer': 8, "ren'py": 2, 'diff': 32, 'slash': 1, 'darcs-patch': 1, 'numpy': 1, 'augeas': 1, 'wisp': 1, 'edn': 15, 'ooc': 1, 'bitbake': 2, 'labview': 1, 'inform-7': 1, 'rust': 101, 'creole': 1, 'apl': 1, 'arduino': 11, 'openscad': 2, 'cuda': 9, 'thrift': 1, 'yaml': 101, 'fancy': 1, 'coldfusion': 1, 'python': 101, 'clarion': 1, 'glyph': 1, 'parrot': 1, 'lookml': 1, 'java-server-pages': 19, 'oxygene': 1, 'flux': 1, 'scilab': 1, 'groovy-server-pages': 2, 'rhtml': 1, 'eagle': 52, 'parrot-assembly': 1, 'igor-pro': 1, 'webidl': 1, 'bluespec': 1, 'unified-parallel-c': 1, 'smali': 38, 'haxe': 9, 'ada': 7, 'lua': 48, 'pascal': 21, 'html+php': 6, 'irc-log': 1, 'x10': 1, 'netlogo': 1, 'ioke': 1, 'dm': 1, 'self': 1, 'elm': 5, 'ats': 1, 'brainfuck': 1, 'mask': 1, 'rouge': 1, 'turing': 1, 'lex': 2, 'gap': 1, 'pogoscript': 1, 'kicad': 30, 'io': 1, 'objective-c++': 8, 'qml': 4, 'redcode': 1, 'autoit': 2, 'processing': 4, 'systemverilog': 6, 'gdscript': 5, 'f-sharp': 12, 'fortran': 23, 'monkey': 1, 'c-sharp': 101, 'xslt': 9, 'viml': 6, 'renderscript': 1, 'scss': 84, 'cucumber': 4, 'verilog': 1, 'genshi': 1, 
'racket': 1, 'krl': 1, 'actionscript': 10, 'pan': 1, 'cirru': 1, 'chapel': 1, 'pure-data': 2, 'm': 1, 'applescript': 1, 'inno-setup': 1, 'volt': 1, 'myghty': 1, 'groovy': 17, 'ags-script': 1, 'mirah': 1, 'lsl': 1, 'brightscript': 1, 'python-traceback': 1, 'sourcepawn': 2, 'maxscript': 1, 'zephir': 1, 'supercollider': 1, 'mathematica': 20, 'awk': 1, 'autohotkey': 2, 'lfe': 1, 'ruby': 101, 'visual-basic': 20, 'ini': 59, 'red': 1, 'omgrofl': 1, 'idl': 1, 'rebol': 1, 'vue': 101, 'ninja': 2, 'ecl': 1, 'lolcode': 1, 'tea': 1, 'txl': 1, 'smarty': 9, 'vcl': 1, 'php': 101, 'literate-coffeescript': 1, 'click': 1, 'pony': 1, 'mediawiki': 5, 'stata': 5, 'stan': 1, 'nginx': 1, 'asciidoc': 16, 'antlr': 1, 'cobol': 1, 'org': 5, 'latte': 1, 'makefile': 32, 'ceylon': 1, 'graphviz-(dot)': 13, 'lilypond': 1, 'dylan': 1, 'qmake': 1, 'muf': 1, 'j': 1, 'pov-ray-sdl': 1, 'jasmin': 1, 'shell': 73, 'cycript': 1, 'boo': 1, 'hlsl': 2}
stack_base_url = "https://huggingface.co/datasets/allenai/dolma/resolve/main/data/stack-code/"
stack_url_list = []
for lang, num_files in sorted(LANG_TO_FILES.items()):
for i in range(num_files):
stack_url_list.append(f"{stack_base_url}{lang}/v3-{str(i).zfill(4)}.json.gz\n dir={OUT_DIRECTORY}/stack-code/{lang}\n out=v3-{str(i).zfill(4)}.json.gz")
# Combine all URL lists
all_url_list = cc_en_head_url_list + cc_en_middle_url_list + cc_en_tail_url_list + s2_v3_url_list + stack_url_list
# Write the combined list of URLs to files.txt (one aria2 input entry per URL)
with open("files.txt", "a") as out:
    for url in all_url_list:
        out.write(url + "\n")
```
Then you can download them all in parallel using:
`aria2c --input-file files.txt --header 'Authorization: Bearer YOUR_HF_HUB_ACCESS_TOKEN'`
You can also add `-s` to increase the number of connections, e.g. `-s 10` (defaults to 5).
To get the exact per-language file counts used for The Stack in the script above (`LANG_TO_FILES`), you can do the following:
First fetch the repository without downloading the LFS files themselves (fast): `GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:datasets/allenai/dolma.git`
Then run:
```python
import os
directory = "dolma/data/stack-code"
folder_dict = {}
for folder in os.listdir(directory):
folder_path = os.path.join(directory, folder)
if os.path.isdir(folder_path):
file_count = len([f for f in os.listdir(folder_path) if os.path.isfile(os.path.join(folder_path, f))])
folder_dict[folder] = file_count
print(folder_dict)
```
| [
-0.5167892575263977,
-0.5468116402626038,
0.2582781910896301,
0.0716937854886055,
-0.03642783313989639,
0.4475589096546173,
-0.1702003926038742,
-0.1961478441953659,
0.5595974922180176,
0.2060997039079666,
-0.5839243531227112,
-0.8254532814025879,
-0.5435904264450073,
0.2304079532623291,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
TrainingDataPro/speech-emotion-recognition-dataset | TrainingDataPro | 2023-09-19T19:34:11Z | 70 | 1 | null | [
"task_categories:audio-classification",
"language:en",
"license:cc-by-nc-nd-4.0",
"code",
"legal",
"region:us"
] | 2023-09-19T19:34:11Z | 2023-07-13T12:46:41.000Z | 2023-07-13T12:46:41 | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- audio-classification
tags:
- code
- legal
dataset_info:
features:
- name: set_id
dtype: string
- name: euphoric
dtype: audio
- name: joyfully
dtype: audio
- name: sad
dtype: audio
- name: surprised
dtype: audio
- name: text
dtype: string
- name: gender
dtype: string
- name: age
dtype: int8
- name: country
dtype: string
splits:
- name: train
num_bytes: 17202
num_examples: 20
download_size: 28409585
dataset_size: 17202
---
# Emotions on Audio Dataset
The audio dataset consists of a collection of texts spoken with four distinct emotions. These texts are spoken in English and represent four different emotional states: **euphoria, joy, sadness and surprise**.
Each audio clip captures the tone, intonation, and nuances of speech as individuals convey their emotions through their voice.
The dataset includes a diverse range of speakers, ensuring variability in *age, gender, and cultural backgrounds*, allowing for a more comprehensive representation of the emotional spectrum.
The dataset is labeled and organized based on the emotion expressed in each audio sample, making it a valuable resource for emotion recognition and analysis. Researchers and developers can utilize this dataset to train and evaluate machine learning models and algorithms, aiming to accurately recognize and classify emotions in speech.
### The audio dataset also provides an opportunity for various applications:
- sentiment analysis
- automatic emotion detection
- emotional speech synthesis
- voice assistants
- customer service
- mental health analysis
- entertainment industries

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=speech-emotion-recognition-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
- **files**: folders, one per person, each containing the text spoken in English in four different manners: **euphoric, joyfully, sad and surprised**
- **.csv** file: contains information about people in the dataset
### File with the extension .csv
includes the following information for each set of media files:
- **set_id**: link to the set of audio files,
- **text**: text spoken in the audio set,
- **gender**: gender of the person,
- **age**: age of the person,
- **country**: country of the person
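The metadata file can be read with the standard library alone. A small sketch: the header row matches the fields listed above, while the file name and sample values are illustrative, not taken from the real dataset:

```python
import csv
import io

# Parse a metadata CSV with the fields described above. The sample row here
# is made up for illustration; the real file ships with the dataset.
sample = (
    "set_id,text,gender,age,country\n"
    "set_0001,Hello there,female,29,United Kingdom\n"
)

rows = list(csv.DictReader(io.StringIO(sample)))
print(rows[0]["gender"], rows[0]["age"])  # → female 29
```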
# Audio with emotions might be collected in accordance with your requirements.
## [TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=speech-emotion-recognition-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** | [
-0.45005810260772705,
-0.33792752027511597,
0.10435549914836884,
0.3578565716743469,
-0.0759793072938919,
0.08975107222795486,
-0.4696478843688965,
-0.4374532997608185,
0.29739731550216675,
0.31077784299850464,
-0.8224388360977173,
-0.9305381178855896,
-0.5144615173339844,
0.14699353277683... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MedHALT/Med-HALT | MedHALT | 2023-08-10T15:27:31Z | 70 | 6 | null | [
"license:apache-2.0",
"arxiv:2307.15343",
"region:us"
] | 2023-08-10T15:27:31Z | 2023-08-10T07:30:11.000Z | 2023-08-10T07:30:11 | ---
license: apache-2.0
configs:
- config_name: IR_abstract2pubmedlink
data_files: "IR_abstract2pubmedlink/IR_abstract2pubmedlink.csv"
- config_name: IR_pubmedlink2title
data_files: "IR_pubmedlink2title/IR_pubmedlink2title.csv"
- config_name: IR_pmid2title
data_files: "IR_pmid2title/IR_pmid2title.csv"
- config_name: IR_title2pubmedlink
data_files: "IR_title2pubmedlink/IR_title2pubmedlink.csv"
- config_name: reasoning_fake
data_files: "reasoning_fake/reasoning_fake.csv"
- config_name: reasoning_nota
data_files: "reasoning_nota/reasoning_nota.csv"
- config_name: reasoning_FCT
data_files: "reasoning_FCT/reasoning_FCT.csv"
---
# Med-HALT: Medical Domain Hallucination Test for Large Language Models
This is the dataset used in the [Med-HALT](https://arxiv.org/abs/2307.15343) research paper, which focuses on the challenges posed by hallucinations in large language models (LLMs), particularly in the medical domain. We propose a new benchmark and dataset, Med-HALT (Medical Domain Hallucination Test), designed specifically to evaluate hallucinations.
Med-HALT provides a diverse multinational dataset derived from medical examinations across various countries and includes multiple innovative testing modalities. It comprises two categories of tests, reasoning-based and memory-based hallucination tests, designed to assess LLMs' problem-solving and information-retrieval abilities. Our study evaluated leading LLMs, including Text Davinci, GPT-3.5, LLaMA, and Falcon, revealing significant differences in their performance. The paper provides detailed insights into the dataset, promoting transparency and reproducibility. Through this work, we aim to contribute to the development of safer and more reliable language models in healthcare. Our benchmark can be found at https://github.com/medhalt/medhalt
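Each config listed in this card's YAML header maps to a single CSV at `<config>/<config>.csv` in the repository, so the file paths can be derived from the config names alone. A minimal sketch (actually loading a config would use the `datasets` library, e.g. `load_dataset("MedHALT/Med-HALT", "reasoning_FCT")`):

```python
# Config names as declared in this card's YAML header; each maps to one CSV
# at "<config>/<config>.csv" inside the repository.
CONFIG_NAMES = [
    "IR_abstract2pubmedlink", "IR_pubmedlink2title", "IR_pmid2title",
    "IR_title2pubmedlink", "reasoning_fake", "reasoning_nota", "reasoning_FCT",
]

def data_file_for(config: str) -> str:
    """Return the repo-relative CSV path for a given config."""
    return f"{config}/{config}.csv"

print(data_file_for("reasoning_FCT"))  # → reasoning_FCT/reasoning_FCT.csv
```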
## Benchmark
The Med-HALT framework proposes a two-tiered approach to evaluate the presence and impact of hallucinations in generated outputs.
#### Reasoning Hallucination Tests (RHTs)
<details>
<summary>False Confidence Test (FCT)</summary>
The False Confidence Test (FCT) involves presenting a multiple-choice medical question and a randomly suggested correct answer to the language model, tasking it with evaluating the validity of the proposed answer and providing detailed explanations for its correctness or incorrectness, in addition to explaining why the other options are wrong.
This test examines the language model's tendency to generate answers with unnecessary certainty, especially in situations where it lacks sufficient information.
</details>
<details>
<summary>None of the Above Test (Nota)</summary>
In the None of the Above (Nota) Test, the model is presented with a multiple-choice medical question where the correct answer is replaced by 'None of the above', requiring the model to identify this and justify its selection.
It tests the model's ability to distinguish irrelevant or incorrect information.
</details>
<details>
<summary>Fake Questions Test (FQT)</summary>
This test involves presenting the model with fake or nonsensical medical questions to examine whether it can correctly identify and handle such queries.
We employed a hybrid approach for generating fake questions, where a subset was crafted by human experts, while the remaining were generated using GPT-3.5.
</details>
#### Memory Hallucination Tests (MHTs)
<details>
<summary>Abstract-to-Link Test</summary>
Given the abstract of a PubMed article, the LLM is asked to generate the corresponding link to the article. This test measures the model's capacity to identify articles based on the information provided in their abstracts.
</details>
<details>
<summary>PMID-to-Title Test</summary>
In this test, the LLM is given the PubMed ID (PMID) of an article and is asked to generate the title of the article. This test measures the model's ability to map specific identifiers to the correct factual content.
</details>
<details>
<summary>Title-to-Link Test</summary>
Given the title of a PubMed article, the LLM is prompted to provide the PubMed link of the article. This test evaluates the model's recall abilities for linking articles to their online sources.
</details>
<details>
<summary>Link-to-Title Test</summary>
Similar to the previous one, in this test, we give the PubMed link of an article as input and ask the language model to provide the title as output. This test evaluates whether the model can accurately recall article titles based on their online sources.
</details>
## Citation
```
@article{Medhalt,
title={Med-HALT: Medical Domain Hallucination Test for Large Language Models},
author={Umapathi, Logesh Kumar and Pal, Ankit and Sankarasubbu, Malaikannan},
journal={arXiv preprint},
year={2023}
}
``` | [
-0.42214730381965637,
-1.0366133451461792,
0.7311859130859375,
0.09904585778713226,
-0.0410325825214386,
-0.153450146317482,
-0.05897511541843414,
-0.5768132209777832,
0.333416223526001,
0.42505839467048645,
-0.5323172211647034,
-0.3955709934234619,
-0.3611811399459839,
0.4309465289115906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_upstage__Llama-2-70b-instruct | open-llm-leaderboard | 2023-10-17T12:48:37Z | 70 | 0 | null | [
"region:us"
] | 2023-10-17T12:48:37Z | 2023-08-17T23:49:28.000Z | 2023-08-17T23:49:28 | ---
pretty_name: Evaluation run of upstage/Llama-2-70b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [upstage/Llama-2-70b-instruct](https://huggingface.co/upstage/Llama-2-70b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__Llama-2-70b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T12:48:24.237609](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__Llama-2-70b-instruct/blob/main/results_2023-10-17T12-48-24.237609.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.49989513422818793,\n\
\ \"em_stderr\": 0.005120467878578845,\n \"f1\": 0.5841736577181234,\n\
\ \"f1_stderr\": 0.004671177225967014,\n \"acc\": 0.5754715400500128,\n\
\ \"acc_stderr\": 0.011730426388075654\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.49989513422818793,\n \"em_stderr\": 0.005120467878578845,\n\
\ \"f1\": 0.5841736577181234,\n \"f1_stderr\": 0.004671177225967014\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32221379833206976,\n \
\ \"acc_stderr\": 0.01287243548118878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962526\n\
\ }\n}\n```"
repo_url: https://huggingface.co/upstage/Llama-2-70b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|arc:challenge|25_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T12_48_24.237609
path:
- '**/details_harness|drop|3_2023-10-17T12-48-24.237609.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T12-48-24.237609.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T12_48_24.237609
path:
- '**/details_harness|gsm8k|5_2023-10-17T12-48-24.237609.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T12-48-24.237609.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hellaswag|10_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T16:38:35.808290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T16:38:35.808290.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T12_48_24.237609
path:
- '**/details_harness|winogrande|5_2023-10-17T12-48-24.237609.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T12-48-24.237609.parquet'
- config_name: results
data_files:
- split: 2023_07_31T16_38_35.808290
path:
- results_2023-07-31T16:38:35.808290.parquet
- split: 2023_10_17T12_48_24.237609
path:
- results_2023-10-17T12-48-24.237609.parquet
- split: latest
path:
- results_2023-10-17T12-48-24.237609.parquet
---
# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/Llama-2-70b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/Llama-2-70b-instruct](https://huggingface.co/upstage/Llama-2-70b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__Llama-2-70b-instruct",
"harness_winogrande_5",
	split="latest")
```
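The configuration names listed in the YAML header follow a fixed pattern: `harness_`, then the harness task name with `:` and `-` normalized to `_`, then the few-shot count. A small helper (illustrative only, not part of the `datasets` API) makes that mapping explicit:

```python
def config_for(task: str, n_shots: int) -> str:
    """Build a configuration name from a harness task name.

    Mirrors the naming scheme visible in the YAML header, e.g.
    "hendrycksTest-abstract_algebra" with 5 shots becomes
    "harness_hendrycksTest_abstract_algebra_5".
    """
    # ":" and "-" in task names are both normalized to "_".
    normalized = task.replace(":", "_").replace("-", "_")
    return f"harness_{normalized}_{n_shots}"

print(config_for("winogrande", 5))     # harness_winogrande_5
print(config_for("truthfulqa:mc", 0))  # harness_truthfulqa_mc_0
```

Any of the resulting names can be passed as the configuration argument of `load_dataset` as in the example above.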
## Latest results
These are the [latest results from run 2023-10-17T12:48:24.237609](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__Llama-2-70b-instruct/blob/main/results_2023-10-17T12-48-24.237609.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.49989513422818793,
"em_stderr": 0.005120467878578845,
"f1": 0.5841736577181234,
"f1_stderr": 0.004671177225967014,
"acc": 0.5754715400500128,
"acc_stderr": 0.011730426388075654
},
"harness|drop|3": {
"em": 0.49989513422818793,
"em_stderr": 0.005120467878578845,
"f1": 0.5841736577181234,
"f1_stderr": 0.004671177225967014
},
"harness|gsm8k|5": {
"acc": 0.32221379833206976,
"acc_stderr": 0.01287243548118878
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962526
}
}
```
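As a quick sanity check on the numbers above, the aggregate `"all"` accuracy is the unweighted mean of the per-task accuracies (here gsm8k and winogrande, the two acc-based tasks in this run):

```python
# Per-task accuracies copied from the latest results above.
task_acc = {
    "harness|gsm8k|5": 0.32221379833206976,
    "harness|winogrande|5": 0.8287292817679558,
}

# The "all" block averages the tasks without weighting by sample count.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.5754715400500128) < 1e-12  # matches "all" -> "acc"
```

The same holds for `em` and `f1`: drop is the only task reporting them, so the aggregate values equal the drop values.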
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

---
pretty_name: Evaluation run of garage-bAInd/Camel-Platypus2-70B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Camel-Platypus2-70B](https://huggingface.co/garage-bAInd/Camel-Platypus2-70B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T06:37:05.018958](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B/blob/main/results_2023-10-16T06-37-05.018958.json) (note\
\ that there might be results for other tasks in the repository if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5069211409395973,\n\
\ \"em_stderr\": 0.0051199774044148345,\n \"f1\": 0.559724203020135,\n\
\ \"f1_stderr\": 0.004829732229468497,\n \"acc\": 0.5345469918434537,\n\
\ \"acc_stderr\": 0.01116294273345166\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5069211409395973,\n \"em_stderr\": 0.0051199774044148345,\n\
\ \"f1\": 0.559724203020135,\n \"f1_stderr\": 0.004829732229468497\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2395754359363154,\n \
\ \"acc_stderr\": 0.01175686434407741\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Camel-Platypus2-70B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|drop|3_2023-09-23T09-15-03.498663.parquet'
- split: 2023_10_16T06_37_05.018958
path:
- '**/details_harness|drop|3_2023-10-16T06-37-05.018958.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T06-37-05.018958.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|gsm8k|5_2023-09-23T09-15-03.498663.parquet'
- split: 2023_10_16T06_37_05.018958
path:
- '**/details_harness|gsm8k|5_2023-10-16T06-37-05.018958.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T06-37-05.018958.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:04:49.359575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T00:04:49.359575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T09_15_03.498663
path:
- '**/details_harness|winogrande|5_2023-09-23T09-15-03.498663.parquet'
- split: 2023_10_16T06_37_05.018958
path:
- '**/details_harness|winogrande|5_2023-10-16T06-37-05.018958.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T06-37-05.018958.parquet'
- config_name: results
data_files:
- split: 2023_08_18T00_04_49.359575
path:
- results_2023-08-18T00:04:49.359575.parquet
- split: 2023_09_23T09_15_03.498663
path:
- results_2023-09-23T09-15-03.498663.parquet
- split: 2023_10_16T06_37_05.018958
path:
- results_2023-10-16T06-37-05.018958.parquet
- split: latest
path:
- results_2023-10-16T06-37-05.018958.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Camel-Platypus2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-70B](https://huggingface.co/garage-bAInd/Camel-Platypus2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T06:37:05.018958](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B/blob/main/results_2023-10-16T06-37-05.018958.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"em": 0.5069211409395973,
"em_stderr": 0.0051199774044148345,
"f1": 0.559724203020135,
"f1_stderr": 0.004829732229468497,
"acc": 0.5345469918434537,
"acc_stderr": 0.01116294273345166
},
"harness|drop|3": {
"em": 0.5069211409395973,
"em_stderr": 0.0051199774044148345,
"f1": 0.559724203020135,
"f1_stderr": 0.004829732229468497
},
"harness|gsm8k|5": {
"acc": 0.2395754359363154,
"acc_stderr": 0.01175686434407741
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825909
}
}
```
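The aggregated numbers above can also be inspected programmatically. As a small self-contained sketch (the dict literal below simply mirrors the JSON shown above rather than fetching anything from the Hub, and the `main_metric` helper is an illustrative assumption, not part of the leaderboard tooling), one can pick out the best-scoring task:

```python
# Sketch: inspect the "Latest results" dict shown above.
# Values are copied from this card; nothing is downloaded.
results = {
    "harness|drop|3": {"em": 0.5069211409395973, "f1": 0.559724203020135},
    "harness|gsm8k|5": {"acc": 0.2395754359363154},
    "harness|winogrande|5": {"acc": 0.829518547750592},
}

def main_metric(scores: dict) -> float:
    # Use accuracy when reported, otherwise fall back to exact match.
    return scores.get("acc", scores.get("em", 0.0))

# Rank tasks by their main metric.
best_task = max(results, key=lambda t: main_metric(results[t]))
print(best_task)  # harness|winogrande|5
```

The same pattern extends to the full `results` config once it is loaded with `load_dataset`, since each row exposes the per-task metric dictionaries in the same shape.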
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.4238039553165436,
-0.6196473836898804,
0.07764719426631927,
0.283314973115921,
-0.21443523466587067,
0.18613506853580475,
-0.40018218755722046,
-0.1482400745153427,
0.31262126564979553,
0.5079782009124756,
-0.7246641516685486,
-0.975869357585907,
-0.5926439166069031,
0.1530662178993225,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_psmathur__orca_mini_v3_70b | open-llm-leaderboard | 2023-08-27T12:40:41Z | 70 | 1 | null | [
"region:us"
] | 2023-08-27T12:40:41Z | 2023-08-18T18:47:52.000Z | 2023-08-18T18:47:52 | ---
pretty_name: Evaluation run of psmathur/orca_mini_v3_70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_v3_70b](https://huggingface.co/psmathur/orca_mini_v3_70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v3_70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-18T01:37:34.029105](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_70b/blob/main/results_2023-08-18T01%3A37%3A34.029105.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7010508529623596,\n\
\ \"acc_stderr\": 0.0309286120388273,\n \"acc_norm\": 0.7049679984523141,\n\
\ \"acc_norm_stderr\": 0.030896356315399304,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n\
\ \"acc_stderr\": 0.00459390260197934,\n \"acc_norm\": 0.8785102569209321,\n\
\ \"acc_norm_stderr\": 0.0032602788112468337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708052,\n\
\ \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.02203721734026783,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.02203721734026783\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"\
acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519513,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305733,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305733\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\
\ \"acc_stderr\": 0.01661139368726857,\n \"acc_norm\": 0.5575418994413408,\n\
\ \"acc_norm_stderr\": 0.01661139368726857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.021330868762127062,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.021330868762127062\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5851063829787234,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559973924380704,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.559973924380704,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427653,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427653\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.6126968953087459,\n\
\ \"mc2_stderr\": 0.015087648780065216\n }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_v3_70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:37:34.029105.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:37:34.029105.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:37:34.029105.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_37_34.029105
path:
- results_2023-08-18T01:37:34.029105.parquet
- split: latest
path:
- results_2023-08-18T01:37:34.029105.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v3_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_70b](https://huggingface.co/psmathur/orca_mini_v3_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
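The timestamped split names used throughout the `configs` section are derived mechanically from the run timestamp. A minimal sketch of the apparent convention (the helper name `split_name` is illustrative, not part of the library):

```python
# Split names appear to be derived from the run timestamp by replacing
# the date/time separators ("-" and ":") with underscores, e.g.
# "2023-08-18T01:37:34.029105" -> "2023_08_18T01_37_34.029105".
def split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-08-18T01:37:34.029105"))
```

This matches the split names listed in the YAML header above for this run.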
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_70b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-18T01:37:34.029105](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_70b/blob/main/results_2023-08-18T01%3A37%3A34.029105.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7010508529623596,
"acc_stderr": 0.0309286120388273,
"acc_norm": 0.7049679984523141,
"acc_norm_stderr": 0.030896356315399304,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6951802429794861,
"acc_stderr": 0.00459390260197934,
"acc_norm": 0.8785102569209321,
"acc_norm_stderr": 0.0032602788112468337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708052,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026783,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026783
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5665024630541872,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.5665024630541872,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.01280978008187893,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.01280978008187893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640255,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519513,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305733,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305733
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.01661139368726857,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.01661139368726857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.021330868762127062,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.021330868762127062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559973924380704,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.559973924380704,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427653,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427653
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.6126968953087459,
"mc2_stderr": 0.015087648780065216
}
}
```
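The per-task scores above can be aggregated by hand, for instance to recompute an MMLU average over the `hendrycksTest` subtasks. A minimal sketch, using a small illustrative subset of the dictionary printed above (the real payload has 57 MMLU entries):

```python
# Average the "acc" metric over the MMLU (hendrycksTest) subtasks.
# The dictionary below is a small illustrative subset of the full
# results payload shown above, not the complete set of tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8157894736842105},
}

mmlu_scores = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} subtasks: {mmlu_avg:.4f}")
```

The same pattern applies to the full dictionary loaded from the results JSON linked above.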
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7245686650276184,
-0.8326889276504517,
0.2742481827735901,
0.18189240992069244,
-0.20680344104766846,
-0.060645412653684616,
0.03179856762290001,
-0.21050505340099335,
0.5985257625579834,
-0.05202775448560715,
-0.5200089812278748,
-0.6923322677612305,
-0.4586913287639618,
0.235862240195... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA | open-llm-leaderboard | 2023-10-13T16:29:55Z | 70 | 0 | null | [
"region:us"
] | 2023-10-13T16:29:55Z | 2023-08-18T18:48:40.000Z | 2023-08-18T18:48:40 | ---
pretty_name: Evaluation run of v2ray/LLaMA-2-Wizard-70B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v2ray/LLaMA-2-Wizard-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T16:29:43.203362](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA/blob/main/results_2023-10-13T16-29-43.203362.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each of them in the results and in the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5358640939597316,\n\
\ \"em_stderr\": 0.005107278772685844,\n \"f1\": 0.5902537751677871,\n\
\ \"f1_stderr\": 0.004795935527255125,\n \"acc\": 0.5639903828029773,\n\
\ \"acc_stderr\": 0.011700610418717068\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5358640939597316,\n \"em_stderr\": 0.005107278772685844,\n\
\ \"f1\": 0.5902537751677871,\n \"f1_stderr\": 0.004795935527255125\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30477634571645185,\n \
\ \"acc_stderr\": 0.012679297549515413\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918725\n\
\ }\n}\n```"
repo_url: https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T16_29_43.203362
path:
- '**/details_harness|drop|3_2023-10-13T16-29-43.203362.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T16-29-43.203362.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T16_29_43.203362
path:
- '**/details_harness|gsm8k|5_2023-10-13T16-29-43.203362.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T16-29-43.203362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:09:43.451689.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:09:43.451689.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T16_29_43.203362
path:
- '**/details_harness|winogrande|5_2023-10-13T16-29-43.203362.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T16-29-43.203362.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_09_43.451689
path:
- results_2023-08-18T07:09:43.451689.parquet
- split: 2023_10_13T16_29_43.203362
path:
- results_2023-10-13T16-29-43.203362.parquet
- split: latest
path:
- results_2023-10-13T16-29-43.203362.parquet
---
# Dataset Card for Evaluation run of v2ray/LLaMA-2-Wizard-70B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [v2ray/LLaMA-2-Wizard-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Wizard-70B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA",
"harness_winogrande_5",
split="train")
```
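The timestamped split names listed in the configurations above follow a simple convention: the `-` and `:` separators in a run timestamp become underscores, while the fractional-second dot is kept. A small helper sketching that mapping (`timestamp_to_split` is a hypothetical name for illustration, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset's configs."""
    # e.g. "2023-08-18T07:09:43.451689" -> "2023_08_18T07_09_43.451689"
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-13T16-29-43.203362"))
# -> 2023_10_13T16_29_43.203362
```

This can be handy for selecting a specific run's split instead of `"latest"` when a configuration contains several runs.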
## Latest results
These are the [latest results from run 2023-10-13T16:29:43.203362](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Wizard-70B-QLoRA/blob/main/results_2023-10-13T16-29-43.203362.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5358640939597316,
"em_stderr": 0.005107278772685844,
"f1": 0.5902537751677871,
"f1_stderr": 0.004795935527255125,
"acc": 0.5639903828029773,
"acc_stderr": 0.011700610418717068
},
"harness|drop|3": {
"em": 0.5358640939597316,
"em_stderr": 0.005107278772685844,
"f1": 0.5902537751677871,
"f1_stderr": 0.004795935527255125
},
"harness|gsm8k|5": {
"acc": 0.30477634571645185,
"acc_stderr": 0.012679297549515413
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918725
}
}
```
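The `"all"` block above appears to be a plain arithmetic mean of each metric over the tasks that report it; a quick sanity check on the numbers from the latest run (a sketch for illustration, not the official aggregation code):

```python
# Per-task accuracies copied from the latest-run JSON above.
task_acc = {
    "harness|gsm8k|5": 0.30477634571645185,
    "harness|winogrande|5": 0.8232044198895028,
}

# Mean over the tasks that report "acc".
mean_acc = sum(task_acc.values()) / len(task_acc)

# Matches the "acc" reported under "all" (0.5639903828029773).
assert abs(mean_acc - 0.5639903828029773) < 1e-9
```

The `em` and `f1` values under `"all"` coincide with the `harness|drop|3` values for the same reason: drop is the only task reporting those metrics.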
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_ehartford__Samantha-1.11-70b | open-llm-leaderboard | last modified 2023-10-19T17:03:07Z | 70 downloads | 0 likes | tags: region:us | created 2023-08-27T11:54:38Z

---
pretty_name: Evaluation run of ehartford/Samantha-1.11-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-70b](https://huggingface.co/ehartford/Samantha-1.11-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T17:02:54.174662](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-70b/blob/main/results_2023-10-19T17-02-54.174662.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5320889261744967,\n\
\ \"em_stderr\": 0.0051099120270992685,\n \"f1\": 0.5767973993288609,\n\
\ \"f1_stderr\": 0.004860619911447506,\n \"acc\": 0.5660724533007654,\n\
\ \"acc_stderr\": 0.011553454771173869\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5320889261744967,\n \"em_stderr\": 0.0051099120270992685,\n\
\ \"f1\": 0.5767973993288609,\n \"f1_stderr\": 0.004860619911447506\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29946929492039426,\n \
\ \"acc_stderr\": 0.012616300735519658\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|arc:challenge|25_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T17_02_54.174662
path:
- '**/details_harness|drop|3_2023-10-19T17-02-54.174662.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T17-02-54.174662.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T17_02_54.174662
path:
- '**/details_harness|gsm8k|5_2023-10-19T17-02-54.174662.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T17-02-54.174662.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hellaswag|10_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T18_30_58.468070
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T18:30:58.468070.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T18:30:58.468070.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T17_02_54.174662
path:
- '**/details_harness|winogrande|5_2023-10-19T17-02-54.174662.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T17-02-54.174662.parquet'
- config_name: results
data_files:
- split: 2023_10_19T17_02_54.174662
path:
- results_2023-10-19T17-02-54.174662.parquet
- split: latest
path:
- results_2023-10-19T17-02-54.174662.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-70b](https://huggingface.co/ehartford/Samantha-1.11-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-70b",
"harness_winogrande_5",
split="train")
```
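The timestamped splits listed in the configs above follow a simple naming convention: the run timestamp with dashes and colons replaced by underscores (the fractional-seconds dot is kept). This is an illustrative sketch of that convention, inferred from the split names in this card rather than from any official tooling:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2023-10-19T17:02:54.174662'
    into the split name used in this card's configs,
    e.g. '2023_10_19T17_02_54.174662'."""
    return ts.replace("-", "_").replace(":", "_")

# The winogrande run timestamp maps to the split name shown above.
assert run_timestamp_to_split("2023-10-19T17:02:54.174662") == "2023_10_19T17_02_54.174662"
```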
## Latest results
These are the [latest results from run 2023-10-19T17:02:54.174662](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-70b/blob/main/results_2023-10-19T17-02-54.174662.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5320889261744967,
"em_stderr": 0.0051099120270992685,
"f1": 0.5767973993288609,
"f1_stderr": 0.004860619911447506,
"acc": 0.5660724533007654,
"acc_stderr": 0.011553454771173869
},
"harness|drop|3": {
"em": 0.5320889261744967,
"em_stderr": 0.0051099120270992685,
"f1": 0.5767973993288609,
"f1_stderr": 0.004860619911447506
},
"harness|gsm8k|5": {
"acc": 0.29946929492039426,
"acc_stderr": 0.012616300735519658
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
}
}
```
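For this run, the top-level `"all"` accuracy is consistent with a plain mean of the two per-task accuracies (gsm8k and winogrande). A minimal sketch verifying that, using the numbers copied from the results above (the averaging scheme is inferred from these values, not documented in this card):

```python
# Per-task accuracies copied from the "Latest results" block above.
task_acc = {
    "harness|gsm8k|5": 0.29946929492039426,
    "harness|winogrande|5": 0.8326756116811366,
}

# The reported "all" accuracy (0.5660724533007654) matches a plain mean.
mean_acc = sum(task_acc.values()) / len(task_acc)
assert abs(mean_acc - 0.5660724533007654) < 1e-12
```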
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_chargoddard__MelangeA-70b | open-llm-leaderboard | 2023-10-17T19:47:20Z | 70 | 0 | null | [
"region:us"
] | 2023-10-17T19:47:20Z | 2023-08-27T12:13:26.000Z | 2023-08-27T12:13:26 | ---
pretty_name: Evaluation run of chargoddard/MelangeA-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/MelangeA-70b](https://huggingface.co/chargoddard/MelangeA-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MelangeA-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T19:47:08.035007](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeA-70b/blob/main/results_2023-10-17T19-47-08.035007.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.030306208053691275,\n\
\ \"em_stderr\": 0.0017555886284412359,\n \"f1\": 0.14531145134227982,\n\
\ \"f1_stderr\": 0.0023604588930624115,\n \"acc\": 0.43608650929616505,\n\
\ \"acc_stderr\": 0.008642384177128263\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.030306208053691275,\n \"em_stderr\": 0.0017555886284412359,\n\
\ \"f1\": 0.14531145134227982,\n \"f1_stderr\": 0.0023604588930624115\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05686125852918878,\n \
\ \"acc_stderr\": 0.006378790242099637\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156888\n\
\ }\n}\n```"
repo_url: https://huggingface.co/chargoddard/MelangeA-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|arc:challenge|25_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T19_47_08.035007
path:
- '**/details_harness|drop|3_2023-10-17T19-47-08.035007.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T19-47-08.035007.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T19_47_08.035007
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-47-08.035007.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-47-08.035007.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hellaswag|10_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T13_15_46.123810
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T13:15:46.123810.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T13:15:46.123810.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T19_47_08.035007
path:
- '**/details_harness|winogrande|5_2023-10-17T19-47-08.035007.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T19-47-08.035007.parquet'
- config_name: results
data_files:
- split: 2023_10_17T19_47_08.035007
path:
- results_2023-10-17T19-47-08.035007.parquet
- split: latest
path:
- results_2023-10-17T19-47-08.035007.parquet
---
# Dataset Card for Evaluation run of chargoddard/MelangeA-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/MelangeA-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/MelangeA-70b](https://huggingface.co/chargoddard/MelangeA-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
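The split names listed in the configs follow a simple convention: the run timestamp with `-` and `:` replaced by underscores (e.g. run `2023-10-17T19:47:08.035007` is stored under split `2023_10_17T19_47_08.035007`). A minimal sketch of that mapping, assuming the convention holds for every run:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name by replacing '-' and ':' with '_'."""
    return ts.replace("-", "_").replace(":", "_")

# e.g. the latest run of this model
print(timestamp_to_split("2023-10-17T19:47:08.035007"))
# -> 2023_10_17T19_47_08.035007
```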
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MelangeA-70b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T19:47:08.035007](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeA-70b/blob/main/results_2023-10-17T19-47-08.035007.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.030306208053691275,
"em_stderr": 0.0017555886284412359,
"f1": 0.14531145134227982,
"f1_stderr": 0.0023604588930624115,
"acc": 0.43608650929616505,
"acc_stderr": 0.008642384177128263
},
"harness|drop|3": {
"em": 0.030306208053691275,
"em_stderr": 0.0017555886284412359,
"f1": 0.14531145134227982,
"f1_stderr": 0.0023604588930624115
},
"harness|gsm8k|5": {
"acc": 0.05686125852918878,
"acc_stderr": 0.006378790242099637
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156888
}
}
```
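For the accuracy metric, the aggregated `all` block matches the unweighted mean of the per-task values; a quick sanity check against the numbers above (assuming equal-weight averaging, which holds for this run):

```python
# Per-task accuracies reported above
gsm8k_acc = 0.05686125852918878
winogrande_acc = 0.8153117600631413

# Aggregated value reported in the "all" block
all_acc = 0.43608650929616505

# The unweighted mean of the two task accuracies reproduces it
mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - all_acc) < 1e-12
```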
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.3964373767375946,
-0.6605874300003052,
0.20213715732097626,
0.22276920080184937,
-0.17597182095050812,
0.11941242218017578,
-0.38481301069259644,
-0.17907604575157166,
0.4399597942829132,
0.6139347553253174,
-0.7190521955490112,
-1.0146411657333374,
-0.6446176171302795,
0.16504469513893... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_chargoddard__MelangeB-70b | open-llm-leaderboard | 2023-10-17T13:18:18Z | 70 | 0 | null | [
"region:us"
] | 2023-10-17T13:18:18Z | 2023-08-27T12:13:44.000Z | 2023-08-27T12:13:44 | ---
pretty_name: Evaluation run of chargoddard/MelangeB-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/MelangeB-70b](https://huggingface.co/chargoddard/MelangeB-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MelangeB-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-17T13:18:04.928943](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeB-70b/blob/main/results_2023-10-17T13-18-04.928943.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.49958053691275167,\n\
\ \"em_stderr\": 0.005120466189311586,\n \"f1\": 0.5792397231543648,\n\
\ \"f1_stderr\": 0.004704767839498484,\n \"acc\": 0.570668027786471,\n\
\ \"acc_stderr\": 0.01156392378740017\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.49958053691275167,\n \"em_stderr\": 0.005120466189311586,\n\
\ \"f1\": 0.5792397231543648,\n \"f1_stderr\": 0.004704767839498484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3062926459438969,\n \
\ \"acc_stderr\": 0.0126969301065629\n },\n \"harness|winogrande|5\": {\n\
\ \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237438\n\
\ }\n}\n```"
repo_url: https://huggingface.co/chargoddard/MelangeB-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T13_18_04.928943
path:
- '**/details_harness|drop|3_2023-10-17T13-18-04.928943.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T13-18-04.928943.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T13_18_04.928943
path:
- '**/details_harness|gsm8k|5_2023-10-17T13-18-04.928943.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T13-18-04.928943.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T14_27_52.893839
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:27:52.893839.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T14:27:52.893839.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T13_18_04.928943
path:
- '**/details_harness|winogrande|5_2023-10-17T13-18-04.928943.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T13-18-04.928943.parquet'
- config_name: results
data_files:
- split: 2023_10_17T13_18_04.928943
path:
- results_2023-10-17T13-18-04.928943.parquet
- split: latest
path:
- results_2023-10-17T13-18-04.928943.parquet
---
# Dataset Card for Evaluation run of chargoddard/MelangeB-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/MelangeB-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/MelangeB-70b](https://huggingface.co/chargoddard/MelangeB-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MelangeB-70b",
"harness_winogrande_5",
	split="latest")
```
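The repository name in the snippet above follows the leaderboard's naming convention: a `details_` prefix plus the model id with its `/` replaced by `__`, under the `open-llm-leaderboard` organization. A tiny helper (hypothetical, not part of the `datasets` library) makes the mapping explicit:

```python
def details_repo(model_id: str) -> str:
    """Map a Hub model id to its Open LLM Leaderboard details dataset.

    Details repos live under the `open-llm-leaderboard` organization and are
    named `details_<model id with '/' replaced by '__'>`.
    """
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo("chargoddard/MelangeB-70b"))
# → open-llm-leaderboard/details_chargoddard__MelangeB-70b
```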
## Latest results
These are the [latest results from run 2023-10-17T13:18:04.928943](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeB-70b/blob/main/results_2023-10-17T13-18-04.928943.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.49958053691275167,
"em_stderr": 0.005120466189311586,
"f1": 0.5792397231543648,
"f1_stderr": 0.004704767839498484,
"acc": 0.570668027786471,
"acc_stderr": 0.01156392378740017
},
"harness|drop|3": {
"em": 0.49958053691275167,
"em_stderr": 0.005120466189311586,
"f1": 0.5792397231543648,
"f1_stderr": 0.004704767839498484
},
"harness|gsm8k|5": {
"acc": 0.3062926459438969,
"acc_stderr": 0.0126969301065629
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237438
}
}
```
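For a quick per-task overview, the per-harness entries above can be flattened into `(task, metric, value)` rows; a minimal sketch using the values reported in this run (the stderr entries are skipped):

```python
# Per-task results copied from the JSON above.
results = {
    "harness|drop|3": {
        "em": 0.49958053691275167, "em_stderr": 0.005120466189311586,
        "f1": 0.5792397231543648, "f1_stderr": 0.004704767839498484,
    },
    "harness|gsm8k|5": {"acc": 0.3062926459438969, "acc_stderr": 0.0126969301065629},
    "harness|winogrande|5": {"acc": 0.835043409629045, "acc_stderr": 0.010430917468237438},
}

# Flatten into (task, metric, value) rows, dropping the stderr columns.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
    if not metric.endswith("_stderr")
]

for task, metric, value in rows:
    print(f"{task:22s} {metric:3s} {value:.4f}")
```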
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of chargoddard/MelangeC-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/MelangeC-70b](https://huggingface.co/chargoddard/MelangeC-70b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__MelangeC-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T03:39:16.431965](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeC-70b/blob/main/results_2023-09-23T03-39-16.431965.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.489618288590604,\n\
\ \"em_stderr\": 0.005119364104825758,\n \"f1\": 0.5680631291946334,\n\
\ \"f1_stderr\": 0.004723246870166152,\n \"acc\": 0.4198895027624309,\n\
\ \"acc_stderr\": 0.005154604749093739\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.489618288590604,\n \"em_stderr\": 0.005119364104825758,\n\
\ \"f1\": 0.5680631291946334,\n \"f1_stderr\": 0.004723246870166152\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n\
\ \"acc_stderr\": 0.010309209498187479\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/MelangeC-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|arc:challenge|25_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- '**/details_harness|drop|3_2023-09-23T03-39-16.431965.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T03-39-16.431965.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- '**/details_harness|gsm8k|5_2023-09-23T03-39-16.431965.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T03-39-16.431965.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hellaswag|10_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T15_40_38.458774
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T15:40:38.458774.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T15:40:38.458774.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- '**/details_harness|winogrande|5_2023-09-23T03-39-16.431965.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T03-39-16.431965.parquet'
- config_name: results
data_files:
- split: 2023_09_23T03_39_16.431965
path:
- results_2023-09-23T03-39-16.431965.parquet
- split: latest
path:
- results_2023-09-23T03-39-16.431965.parquet
---
# Dataset Card for Evaluation run of chargoddard/MelangeC-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/MelangeC-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/MelangeC-70b](https://huggingface.co/chargoddard/MelangeC-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__MelangeC-70b",
"harness_winogrande_5",
	split="latest")
```
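The configuration and split names above are derived mechanically from the run timestamp: in the split names, both hyphens and colons become underscores (the parquet filenames are less uniform — some runs keep the colons, others replace them with hyphens). A minimal sketch of the split-naming convention, observed from the configs above rather than taken from any official API:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2023-09-23T03:39:16.431965" -> "2023_09_23T03_39_16.431965"."""
    # Hyphens and colons both become underscores; the fractional-second dot stays.
    return ts.replace("-", "_").replace(":", "_")
```

This is only a convenience for locating a run's split programmatically; the helper name is illustrative.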
## Latest results
These are the [latest results from run 2023-09-23T03:39:16.431965](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__MelangeC-70b/blob/main/results_2023-09-23T03-39-16.431965.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.489618288590604,
"em_stderr": 0.005119364104825758,
"f1": 0.5680631291946334,
"f1_stderr": 0.004723246870166152,
"acc": 0.4198895027624309,
"acc_stderr": 0.005154604749093739
},
"harness|drop|3": {
"em": 0.489618288590604,
"em_stderr": 0.005119364104825758,
"f1": 0.5680631291946334,
"f1_stderr": 0.004723246870166152
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
}
}
```
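The metric keys above follow a `harness|<task>|<num_fewshot>` convention, with an `"all"` entry aggregating across tasks. A short, self-contained sketch of splitting those keys back into per-task records (values copied from the results above; the variable names are illustrative):

```python
# Per-task results keyed as "harness|<task>|<num_fewshot>"; "all" aggregates them.
results = {
    "all": {"acc": 0.4198895027624309},
    "harness|drop|3": {"em": 0.489618288590604},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.8397790055248618},
}

per_task = {}
for key, metrics in results.items():
    if key == "all":
        continue  # skip the aggregate entry
    _, task, num_fewshot = key.split("|")
    per_task[task] = {"num_fewshot": int(num_fewshot), **metrics}
```

After this, `per_task["winogrande"]` holds the winogrande metrics together with its few-shot count.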
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v2](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T19:44:15.918763](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2/blob/main/results_2023-08-31T19%3A44%3A15.918763.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6793562660993889,\n\
\ \"acc_stderr\": 0.03184581364444873,\n \"acc_norm\": 0.6834576899716158,\n\
\ \"acc_norm_stderr\": 0.03181820263146339,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6450685339919277,\n\
\ \"mc2_stderr\": 0.015210507246763325\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158303,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688067\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6491734714200359,\n\
\ \"acc_stderr\": 0.004762534245488399,\n \"acc_norm\": 0.8536148177653854,\n\
\ \"acc_norm_stderr\": 0.003527695149823521\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899095,\n\
\ \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6553191489361702,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.6553191489361702,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.02788682807838055,\n \
\ \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.02788682807838055\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8825688073394495,\n \"acc_stderr\": 0.013802780227377355,\n \"\
acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.013802780227377355\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\"\
: 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \"\
acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\
\ \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.7533632286995515,\n\
\ \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8582375478927203,\n\
\ \"acc_stderr\": 0.012473289071272051,\n \"acc_norm\": 0.8582375478927203,\n\
\ \"acc_norm_stderr\": 0.012473289071272051\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n\
\ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n\
\ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.02429659403476343,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.02429659403476343\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445796,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445796\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5501955671447197,\n\
\ \"acc_stderr\": 0.012705721498564969,\n \"acc_norm\": 0.5501955671447197,\n\
\ \"acc_norm_stderr\": 0.012705721498564969\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7287581699346405,\n \"acc_stderr\": 0.017986615304030316,\n \
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.017986615304030316\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6450685339919277,\n\
\ \"mc2_stderr\": 0.015210507246763325\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|arc:challenge|25_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hellaswag|10_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T19:44:15.918763.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T19:44:15.918763.parquet'
- config_name: results
data_files:
- split: 2023_08_31T19_44_15.918763
path:
- results_2023-08-31T19:44:15.918763.parquet
- split: latest
path:
- results_2023-08-31T19:44:15.918763.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v2](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2",
"harness_truthfulqa_mc_0",
split="train")
```
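A side observation: the timestamped split names listed in this card's configs (e.g. `2023_08_31T19_44_15.918763`) appear to be the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (an inference from the file names in this card, not a documented convention):

```python
# Derive the per-run split name used in this repo's configs from a run
# timestamp. This mirrors the pattern visible in the config list above
# (an observation from this card, not a documented API).
run_timestamp = "2023-08-31T19:44:15.918763"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_08_31T19_44_15.918763
```

Passing `split=split_name` instead of `split="train"` then targets that specific run rather than the latest one.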
## Latest results
These are the [latest results from run 2023-08-31T19:44:15.918763](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2/blob/main/results_2023-08-31T19%3A44%3A15.918763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6793562660993889,
"acc_stderr": 0.03184581364444873,
"acc_norm": 0.6834576899716158,
"acc_norm_stderr": 0.03181820263146339,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6450685339919277,
"mc2_stderr": 0.015210507246763325
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.013936809212158303,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688067
},
"harness|hellaswag|10": {
"acc": 0.6491734714200359,
"acc_stderr": 0.004762534245488399,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.003527695149823521
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899095,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6553191489361702,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.6553191489361702,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7563025210084033,
"acc_stderr": 0.02788682807838055,
"acc_norm": 0.7563025210084033,
"acc_norm_stderr": 0.02788682807838055
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377355,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377355
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8582375478927203,
"acc_stderr": 0.012473289071272051,
"acc_norm": 0.8582375478927203,
"acc_norm_stderr": 0.012473289071272051
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5463687150837989,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.5463687150837989,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.02429659403476343,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.02429659403476343
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445796,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5501955671447197,
"acc_stderr": 0.012705721498564969,
"acc_norm": 0.5501955671447197,
"acc_norm_stderr": 0.012705721498564969
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.017986615304030316,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.017986615304030316
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6450685339919277,
"mc2_stderr": 0.015210507246763325
}
}
```
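For reference, the per-task `acc_norm` values above can be averaged in the same spirit as the `all` block, which appears to be a plain mean over every task. A minimal sketch over a three-task subset (illustrative only, so the value will not match the reported aggregate):

```python
# Average acc_norm over a subset of the per-task results shown above.
# Illustrative only: the "all" block averages every evaluated task,
# so this three-task mean will not equal the reported 0.6834... value.
results = {
    "harness|arc:challenge|25": 0.6877133105802048,
    "harness|hellaswag|10": 0.8536148177653854,
    "harness|hendrycksTest-abstract_algebra|5": 0.37,
}
mean_acc_norm = sum(results.values()) / len(results)
print(round(mean_acc_norm, 4))  # 0.6371
```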
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3 | open-llm-leaderboard | 2023-09-01T14:03:22Z | 70 | 0 | null | [
"region:us"
] | 2023-09-01T14:03:22Z | 2023-09-01T14:02:23.000Z | 2023-09-01T14:02:23 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T14:01:58.848407](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3/blob/main/results_2023-09-01T14%3A01%3A58.848407.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6813782482106774,\n\
\ \"acc_stderr\": 0.03171011741691581,\n \"acc_norm\": 0.6847848607826429,\n\
\ \"acc_norm_stderr\": 0.031684498624315015,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6421820394674438,\n\
\ \"mc2_stderr\": 0.015085186356964665\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283504,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6936865166301533,\n\
\ \"acc_stderr\": 0.004600194559865542,\n \"acc_norm\": 0.8716391157140012,\n\
\ \"acc_norm_stderr\": 0.003338076015617253\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756192,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756192\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n\
\ \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n\
\ \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n\
\ \"acc_stderr\": 0.0314895582974553,\n \"acc_norm\": 0.6340425531914894,\n\
\ \"acc_norm_stderr\": 0.0314895582974553\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04644602091222318,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04644602091222318\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48412698412698413,\n \"acc_stderr\": 0.025738330639412152,\n \"\
acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.025738330639412152\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329286,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329286\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652459,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652459\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279476,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279476\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588949,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588949\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426998,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426998\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990915,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990915\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n\
\ \"acc_stderr\": 0.012378786101885145,\n \"acc_norm\": 0.8607918263090677,\n\
\ \"acc_norm_stderr\": 0.012378786101885145\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5787709497206703,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.5787709497206703,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.02228231394977488,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.02228231394977488\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5709219858156028,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.5709219858156028,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248349,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7336601307189542,\n \"acc_stderr\": 0.017883188134667206,\n \
\ \"acc_norm\": 0.7336601307189542,\n \"acc_norm_stderr\": 0.017883188134667206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900794,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900794\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6421820394674438,\n\
\ \"mc2_stderr\": 0.015085186356964665\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|arc:challenge|25_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hellaswag|10_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T14:01:58.848407.parquet'
- config_name: results
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- results_2023-09-01T14:01:58.848407.parquet
- split: latest
path:
- results_2023-09-01T14:01:58.848407.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3",
	"harness_truthfulqa_mc_0",
	split="latest")
```
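The per-task configurations listed above all follow the same naming pattern, so the config name for any MMLU subject can be built programmatically. The sketch below is illustrative only: the `task_config` helper is not part of this dataset's tooling, just a convenience derived from the config list above.

```python
# Repository holding the detailed evaluation results for this run.
REPO = "open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3"

def task_config(task: str, shots: int = 5) -> str:
    """Build the config name for one MMLU subject (pattern taken from the config list above)."""
    return f"harness_hendrycksTest_{task}_{shots}"

# Example (requires network access):
#   from datasets import load_dataset
#   details = load_dataset(REPO, task_config("anatomy"), split="latest")
#   aggregated = load_dataset(REPO, "results", split="latest")
```

The aggregated metrics shown in the "Latest results" section below come from the "results" configuration.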
## Latest results
These are the [latest results from run 2023-09-01T14:01:58.848407](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3/blob/main/results_2023-09-01T14%3A01%3A58.848407.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.6813782482106774,
"acc_stderr": 0.03171011741691581,
"acc_norm": 0.6847848607826429,
"acc_norm_stderr": 0.031684498624315015,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6421820394674438,
"mc2_stderr": 0.015085186356964665
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283504,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.013572657703084948
},
"harness|hellaswag|10": {
"acc": 0.6936865166301533,
"acc_stderr": 0.004600194559865542,
"acc_norm": 0.8716391157140012,
"acc_norm_stderr": 0.003338076015617253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.02619980880756192,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.02619980880756192
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.025738330639412152,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.025738330639412152
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652459,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652459
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279476,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279476
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588949,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588949
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426998,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426998
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990915,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8607918263090677,
"acc_stderr": 0.012378786101885145,
"acc_norm": 0.8607918263090677,
"acc_norm_stderr": 0.012378786101885145
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5787709497206703,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.5787709497206703,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.02228231394977488,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.02228231394977488
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5709219858156028,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.5709219858156028,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248349,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7336601307189542,
"acc_stderr": 0.017883188134667206,
"acc_norm": 0.7336601307189542,
"acc_norm_stderr": 0.017883188134667206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900794,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900794
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6421820394674438,
"mc2_stderr": 0.015085186356964665
}
}
```
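The aggregated results blob above is plain JSON, so it can be post-processed directly. The sketch below uses an illustrative two-task excerpt (values copied from the blob above, not the full set) to average the 5-shot `hendrycksTest` accuracies:

```python
import json

# Illustrative excerpt of the results blob shown above (values copied from it).
results_json = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
  "harness|truthfulqa:mc|0": {"mc1": 0.45532435740514077}
}
"""
results = json.loads(results_json)

# Keep only the 5-shot MMLU ("hendrycksTest") tasks and average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mmlu_avg:.4f}")
```

The same filtering works on the full blob, since every MMLU subtask key shares the `harness|hendrycksTest-` prefix.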
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v4 | open-llm-leaderboard | 2023-09-02T05:23:56Z | 70 | 0 | null | ["region:us"] | 2023-09-02T05:23:56Z | 2023-09-02T05:22:59.000Z | 2023-09-02T05:22:59 |
---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v4](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T05:22:36.145219](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v4/blob/main/results_2023-09-02T05%3A22%3A36.145219.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6965991479686341,\n\
\ \"acc_stderr\": 0.03128858284011723,\n \"acc_norm\": 0.7002900066329676,\n\
\ \"acc_norm_stderr\": 0.03126026366396146,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6260206771095533,\n\
\ \"mc2_stderr\": 0.014926739687315194\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173302,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.01327307786590759\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6838279227245568,\n\
\ \"acc_stderr\": 0.0046403067196280675,\n \"acc_norm\": 0.8734315873332006,\n\
\ \"acc_norm_stderr\": 0.00331809357970292\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.033911609343436025,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.033911609343436025\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.026880647889051982,\n\
\ \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.026880647889051982\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n\
\ \"acc_stderr\": 0.02951424596429177,\n \"acc_norm\": 0.8541666666666666,\n\
\ \"acc_norm_stderr\": 0.02951424596429177\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774565,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774565\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154954,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154954\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329276,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329276\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865376,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865376\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776826,\n\
\ \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776826\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652459,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652459\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02626502460827588,\n \
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02626502460827588\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080437,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n\
\ \"acc_stderr\": 0.025998379092356513,\n \"acc_norm\": 0.8161434977578476,\n\
\ \"acc_norm_stderr\": 0.025998379092356513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.031722334260021585,\n \"\
acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.031722334260021585\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6502793296089385,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.6502793296089385,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.024406162094668893,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.024406162094668893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225163,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.029555454236778845,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.029555454236778845\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5919165580182529,\n\
\ \"acc_stderr\": 0.012552598958563671,\n \"acc_norm\": 0.5919165580182529,\n\
\ \"acc_norm_stderr\": 0.012552598958563671\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274054,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274054\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070813,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070813\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6260206771095533,\n\
\ \"mc2_stderr\": 0.014926739687315194\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|arc:challenge|25_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hellaswag|10_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T05:22:36.145219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T05:22:36.145219.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T05:22:36.145219.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T05:22:36.145219.parquet'
- config_name: results
data_files:
- split: 2023_09_02T05_22_36.145219
path:
- results_2023-09-02T05:22:36.145219.parquet
- split: latest
path:
- results_2023-09-02T05:22:36.145219.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v4](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v4",
"harness_truthfulqa_mc_0",
	split="latest")
```
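Each split name encodes the timestamp of the run, with `-` and `:` replaced by `_` (compare the split names in the configuration list above with the run timestamp). A small helper — hypothetical, not part of the `datasets` API — can derive the split name from a run timestamp:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into the split name used in this dataset.

    e.g. "2023-09-02T05:22:36.145219" -> "2023_09_02T05_22_36.145219"
    """
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")
```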
## Latest results
These are the [latest results from run 2023-09-02T05:22:36.145219](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v4/blob/main/results_2023-09-02T05%3A22%3A36.145219.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6965991479686341,
"acc_stderr": 0.03128858284011723,
"acc_norm": 0.7002900066329676,
"acc_norm_stderr": 0.03126026366396146,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6260206771095533,
"mc2_stderr": 0.014926739687315194
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173302,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.01327307786590759
},
"harness|hellaswag|10": {
"acc": 0.6838279227245568,
"acc_stderr": 0.0046403067196280675,
"acc_norm": 0.8734315873332006,
"acc_norm_stderr": 0.00331809357970292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.026880647889051982,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.026880647889051982
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.02951424596429177,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.02951424596429177
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774565,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774565
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154954,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154954
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329276,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329276
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865376,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865376
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880236,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880236
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776826,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652459,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652459
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02626502460827588,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02626502460827588
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080437,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356513,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.031722334260021585,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.031722334260021585
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6502793296089385,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.6502793296089385,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958154,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958154
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668893,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225163,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.029555454236778845,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.029555454236778845
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5919165580182529,
"acc_stderr": 0.012552598958563671,
"acc_norm": 0.5919165580182529,
"acc_norm_stderr": 0.012552598958563671
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274054,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274054
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070813,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070813
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6260206771095533,
"mc2_stderr": 0.014926739687315194
}
}
```
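Since the per-task results above are plain JSON, they are easy to post-process once downloaded. A minimal sketch (the `rank_tasks` helper is hypothetical, and only a handful of the accuracy values shown above are copied in):

```python
# A few of the per-task entries copied from the results JSON above
# (5-shot MMLU accuracies with their standard errors).
results = {
    "harness|hendrycksTest-moral_disputes|5": {"acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174},
    "harness|hendrycksTest-virology|5": {"acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216},
    "harness|hendrycksTest-sociology|5": {"acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8538011695906432, "acc_stderr": 0.027097290118070813},
}

def rank_tasks(results):
    """Return (task_name, acc) pairs sorted from highest to lowest accuracy."""
    # Task keys look like "harness|hendrycksTest-sociology|5"; the middle
    # segment is the task name.
    pairs = [(name.split("|")[1], metrics["acc"]) for name, metrics in results.items()]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

ranking = rank_tasks(results)
# ranking[0] -> ("hendrycksTest-sociology", 0.8756218905472637)
```

On this subset, sociology (0.876) comes out on top and virology (0.518) at the bottom, matching the raw numbers above.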
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2 | open-llm-leaderboard | 2023-10-17T14:51:33Z | 70 | 0 | null | ["region:us"] | 2023-10-17T14:51:33Z | 2023-09-02T17:59:24.000Z | 2023-09-02T17:59:24
---
pretty_name: Evaluation run of migtissera/Synthia-70B-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B-v1.2](https://huggingface.co/migtissera/Synthia-70B-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T14:51:20.480254](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2/blob/main/results_2023-10-17T14-51-20.480254.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.364618288590604,\n\
\ \"em_stderr\": 0.004929197624393639,\n \"f1\": 0.42417365771812215,\n\
\ \"f1_stderr\": 0.004776577842624861,\n \"acc\": 0.5759284047791582,\n\
\ \"acc_stderr\": 0.011665477241539865\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.364618288590604,\n \"em_stderr\": 0.004929197624393639,\n\
\ \"f1\": 0.42417365771812215,\n \"f1_stderr\": 0.004776577842624861\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3191811978771797,\n \
\ \"acc_stderr\": 0.012840345676251651\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T14_51_20.480254
path:
- '**/details_harness|drop|3_2023-10-17T14-51-20.480254.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T14-51-20.480254.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T14_51_20.480254
path:
- '**/details_harness|gsm8k|5_2023-10-17T14-51-20.480254.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T14-51-20.480254.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:59:05.420313.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:59:05.420313.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T14_51_20.480254
path:
- '**/details_harness|winogrande|5_2023-10-17T14-51-20.480254.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T14-51-20.480254.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_59_05.420313
path:
- results_2023-09-02T17:59:05.420313.parquet
- split: 2023_10_17T14_51_20.480254
path:
- results_2023-10-17T14-51-20.480254.parquet
- split: latest
path:
- results_2023-10-17T14-51-20.480254.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B-v1.2](https://huggingface.co/migtissera/Synthia-70B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2",
"harness_winogrande_5",
split="train")
```
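The split names shown in the configs above are simply the run timestamps with `-` and `:` replaced by `_`. A small helper (hypothetical, for illustration only; not part of the evaluation tooling) makes that mapping explicit:

```python
def split_name(run_timestamp: str) -> str:
    """Map a run timestamp to the split name used in this dataset's configs.

    e.g. "2023-10-17T14:51:20.480254" -> "2023_10_17T14_51_20.480254"
    """
    date, time = run_timestamp.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")
```

For instance, `split_name("2023-10-17T14:51:20.480254")` returns `"2023_10_17T14_51_20.480254"`, the timestamped split listed in the configs above.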
## Latest results
These are the [latest results from run 2023-10-17T14:51:20.480254](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2/blob/main/results_2023-10-17T14-51-20.480254.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.364618288590604,
"em_stderr": 0.004929197624393639,
"f1": 0.42417365771812215,
"f1_stderr": 0.004776577842624861,
"acc": 0.5759284047791582,
"acc_stderr": 0.011665477241539865
},
"harness|drop|3": {
"em": 0.364618288590604,
"em_stderr": 0.004929197624393639,
"f1": 0.42417365771812215,
"f1_stderr": 0.004776577842624861
},
"harness|gsm8k|5": {
"acc": 0.3191811978771797,
"acc_stderr": 0.012840345676251651
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
}
}
```
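As a small illustrative sketch (not part of the evaluation tooling), the per-task metrics in a payload like the one above can be separated from the `all` aggregate with a dict comprehension; task keys follow the `harness|<task>|<n_shots>` pattern:

```python
# Abbreviated copy of the latest-results payload shown above.
results = {
    "all": {"em": 0.364618288590604, "f1": 0.42417365771812215, "acc": 0.5759284047791582},
    "harness|drop|3": {"em": 0.364618288590604, "f1": 0.42417365771812215},
    "harness|gsm8k|5": {"acc": 0.3191811978771797},
    "harness|winogrande|5": {"acc": 0.8326756116811366},
}

# Per-task accuracy, skipping the "all" aggregate and tasks that report no "acc".
task_acc = {task: m["acc"] for task, m in results.items() if task != "all" and "acc" in m}
```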
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.44495251774787903,
-0.6435449123382568,
0.2918092906475067,
0.22593049705028534,
-0.1926969289779663,
0.12396363914012909,
-0.3841487765312195,
-0.20379707217216492,
0.45176562666893005,
0.45941051840782166,
-0.6985335946083069,
-1.0293426513671875,
-0.6728088855743408,
0.21881222724914... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7 | open-llm-leaderboard | 2023-09-04T02:39:23Z | 70 | 0 | null | [
"region:us"
] | 2023-09-04T02:39:23Z | 2023-09-04T02:38:25.000Z | 2023-09-04T02:38:25 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6832397060915553,\n\
\ \"acc_stderr\": 0.031693477754770626,\n \"acc_norm\": 0.6869592578044069,\n\
\ \"acc_norm_stderr\": 0.03166529474407705,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n\
\ \"mc2_stderr\": 0.01502146266727205\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.01368814730972912,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6818362875921131,\n\
\ \"acc_stderr\": 0.004648115322328777,\n \"acc_norm\": 0.873132842063334,\n\
\ \"acc_norm_stderr\": 0.0033214390244115494\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534422,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534422\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853113,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853113\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6948717948717948,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.6948717948717948,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.02132833757080438,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.02132833757080438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878456,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878456\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5910614525139665,\n\
\ \"acc_stderr\": 0.016442830654715548,\n \"acc_norm\": 0.5910614525139665,\n\
\ \"acc_norm_stderr\": 0.016442830654715548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398195,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398195\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898968,\n\
\ \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7238562091503268,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.7238562091503268,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986214,\n \"mc2\": 0.6310264033909807,\n\
\ \"mc2_stderr\": 0.01502146266727205\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T02:38:01.038212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T02:38:01.038212.parquet'
- config_name: results
data_files:
- split: 2023_09_04T02_38_01.038212
path:
- results_2023-09-04T02:38:01.038212.parquet
- split: latest
path:
- results_2023-09-04T02:38:01.038212.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v7](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-04T02:38:01.038212](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v7/blob/main/results_2023-09-04T02%3A38%3A01.038212.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6832397060915553,
"acc_stderr": 0.031693477754770626,
"acc_norm": 0.6869592578044069,
"acc_norm_stderr": 0.03166529474407705,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986214,
"mc2": 0.6310264033909807,
"mc2_stderr": 0.01502146266727205
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.01368814730972912,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725227
},
"harness|hellaswag|10": {
"acc": 0.6818362875921131,
"acc_stderr": 0.004648115322328777,
"acc_norm": 0.873132842063334,
"acc_norm_stderr": 0.0033214390244115494
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534422,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534422
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853113,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853113
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6948717948717948,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.6948717948717948,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.02132833757080438,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.02132833757080438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878456,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878456
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5910614525139665,
"acc_stderr": 0.016442830654715548,
"acc_norm": 0.5910614525139665,
"acc_norm_stderr": 0.016442830654715548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398195,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398195
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8024691358024691,
"acc_stderr": 0.022152889927898968,
"acc_norm": 0.8024691358024691,
"acc_norm_stderr": 0.022152889927898968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986214,
"mc2": 0.6310264033909807,
"mc2_stderr": 0.01502146266727205
}
}
```
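As a minimal sketch (not part of the evaluation harness), the per-task `acc` values in the JSON above can be aggregated into a macro-average; the three values below are copied from the results block for illustration:

```python
# Macro-average the "acc" scores of a few per-task harness results
# (values copied from the JSON results above; subset is illustrative).
results = {
    "harness|hendrycksTest-nutrition|5": 0.7091503267973857,
    "harness|hendrycksTest-philosophy|5": 0.7717041800643086,
    "harness|hendrycksTest-prehistory|5": 0.8024691358024691,
}

# Unweighted mean over tasks (each task counts equally, regardless of size).
macro_avg = sum(results.values()) / len(results)
print(f"macro-average acc over {len(results)} tasks: {macro_avg:.4f}")
```

Note this is an unweighted mean over tasks, which differs from a micro-average weighted by the number of questions per task.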
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.722307562828064,
-0.8052704334259033,
0.2947011888027191,
0.18630746006965637,
-0.22024725377559662,
-0.04643131420016289,
0.0004534001345746219,
-0.27380862832069397,
0.583423912525177,
-0.03588652238249779,
-0.525260329246521,
-0.7115074396133423,
-0.4582767188549042,
0.23147174715995... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6 | open-llm-leaderboard | 2023-09-04T18:45:26Z | 70 | 0 | null | [
"region:us"
] | 2023-09-04T18:45:26Z | 2023-09-04T18:44:27.000Z | 2023-09-04T18:44:27 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-04T18:44:04.025476](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6/blob/main/results_2023-09-04T18%3A44%3A04.025476.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6807030718501101,\n\
\ \"acc_stderr\": 0.03172804575089014,\n \"acc_norm\": 0.6844122287414298,\n\
\ \"acc_norm_stderr\": 0.03169926697255189,\n \"mc1\": 0.43818849449204406,\n\
\ \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.624345044166297,\n\
\ \"mc2_stderr\": 0.014991862964877591\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.013659980894277373,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520767\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6856203943437562,\n\
\ \"acc_stderr\": 0.004633194825793846,\n \"acc_norm\": 0.8720374427404899,\n\
\ \"acc_norm_stderr\": 0.003333654120593691\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.02567008063690919,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.02567008063690919\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267833,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267833\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334332,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334332\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.023400928918310495,\n\
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.023400928918310495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n\
\ \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n\
\ \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n\
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867468,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867468\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n\
\ \"acc_stderr\": 0.012378786101885147,\n \"acc_norm\": 0.8607918263090677,\n\
\ \"acc_norm_stderr\": 0.012378786101885147\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6424581005586593,\n\
\ \"acc_stderr\": 0.01602939447489489,\n \"acc_norm\": 0.6424581005586593,\n\
\ \"acc_norm_stderr\": 0.01602939447489489\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7491961414790996,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.7491961414790996,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898965,\n\
\ \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898965\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5671447196870926,\n\
\ \"acc_stderr\": 0.012654565234622862,\n \"acc_norm\": 0.5671447196870926,\n\
\ \"acc_norm_stderr\": 0.012654565234622862\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031218,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031218\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7352941176470589,\n \"acc_stderr\": 0.01784808957491323,\n \
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.01784808957491323\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43818849449204406,\n\
\ \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.624345044166297,\n\
\ \"mc2_stderr\": 0.014991862964877591\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|arc:challenge|25_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hellaswag|10_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T18:44:04.025476.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T18:44:04.025476.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T18:44:04.025476.parquet'
- config_name: results
data_files:
- split: 2023_09_04T18_44_04.025476
path:
- results_2023-09-04T18:44:04.025476.parquet
- split: latest
path:
- results_2023-09-04T18:44:04.025476.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v6
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6",
"harness_truthfulqa_mc_0",
	split="latest")
```
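As the summary notes, each timestamped split corresponds to one run and "latest" mirrors the newest one. A minimal sketch of how the newest split can be resolved from the timestamp-style names used here (the `resolve_latest` helper is hypothetical, not part of the `datasets` API):

```python
from datetime import datetime

def resolve_latest(split_names):
    """Return the newest timestamp-named split (format: 2023_09_04T18_44_04.025476)."""
    stamped = [s for s in split_names if s != "latest"]
    # Parse the underscore-separated timestamp embedded in each split name.
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2023_09_04T18_44_04.025476", "latest"]
print(resolve_latest(splits))  # → 2023_09_04T18_44_04.025476
```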
## Latest results
These are the [latest results from run 2023-09-04T18:44:04.025476](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v6/blob/main/results_2023-09-04T18%3A44%3A04.025476.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6807030718501101,
"acc_stderr": 0.03172804575089014,
"acc_norm": 0.6844122287414298,
"acc_norm_stderr": 0.03169926697255189,
"mc1": 0.43818849449204406,
"mc1_stderr": 0.017369236164404445,
"mc2": 0.624345044166297,
"mc2_stderr": 0.014991862964877591
},
"harness|arc:challenge|25": {
"acc": 0.6774744027303754,
"acc_stderr": 0.013659980894277373,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520767
},
"harness|hellaswag|10": {
"acc": 0.6856203943437562,
"acc_stderr": 0.004633194825793846,
"acc_norm": 0.8720374427404899,
"acc_norm_stderr": 0.003333654120593691
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.02567008063690919,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.02567008063690919
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334332,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.023400928918310495,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.023400928918310495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867468,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867468
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8607918263090677,
"acc_stderr": 0.012378786101885147,
"acc_norm": 0.8607918263090677,
"acc_norm_stderr": 0.012378786101885147
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6424581005586593,
"acc_stderr": 0.01602939447489489,
"acc_norm": 0.6424581005586593,
"acc_norm_stderr": 0.01602939447489489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7491961414790996,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.7491961414790996,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8024691358024691,
"acc_stderr": 0.022152889927898965,
"acc_norm": 0.8024691358024691,
"acc_norm_stderr": 0.022152889927898965
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5671447196870926,
"acc_stderr": 0.012654565234622862,
"acc_norm": 0.5671447196870926,
"acc_norm_stderr": 0.012654565234622862
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031218,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031218
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.01784808957491323,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.01784808957491323
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43818849449204406,
"mc1_stderr": 0.017369236164404445,
"mc2": 0.624345044166297,
"mc2_stderr": 0.014991862964877591
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7158265709877014,
-0.82305908203125,
0.28889021277427673,
0.18191803991794586,
-0.20603744685649872,
-0.04362862557172775,
0.005441740155220032,
-0.26858341693878174,
0.5840696096420288,
-0.053954966366291046,
-0.539298951625824,
-0.6906648874282837,
-0.4315422475337982,
0.2210682183504... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
rombodawg/LimitlessCodeTraining_Guanaco_Format | rombodawg | 2023-09-29T04:23:00Z | 70 | 0 | null | [
"license:mit",
"region:us"
] | 2023-09-29T04:23:00Z | 2023-09-10T06:05:33.000Z | 2023-09-10T06:05:33 | ---
license: mit
---
This is the LimitlessCodeTraining dataset converted to guanaco format. Enjoy.
Original model card:
_________________
----- BREAK THROUGH YOUR LIMITS -----
_________________

LimitlessCodeTraining is the direct sequel to Megacodetraining, which is now called Legacy_MegaCodeTraining200k.
This dataset is just over 646k lines of pure refined coding data.
It is the pinnacle of open source code training. It combines the Megacode training dataset filtered by shahules786 (shoutout to him) and the bigcode commitpackft dataset I converted to alpaca format.
The datasets that were used to create this dataset are linked below:
- https://huggingface.co/datasets/rombodawg/Rombodawgs_commitpackft_Evolinstruct_Converted
- https://huggingface.co/datasets/shahules786/megacode-best | [
-0.45226508378982544,
-0.33044081926345825,
0.14180892705917358,
0.20362690091133118,
-0.5915623903274536,
-0.44445061683654785,
-0.11331121623516083,
-0.27421700954437256,
0.3668646514415741,
0.8452379703521729,
-0.8132856488227844,
-0.5417417287826538,
-0.512467622756958,
0.0047228643670... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_lloorree__jfdslijsijdgis | open-llm-leaderboard | 2023-09-17T00:36:07Z | 70 | 0 | null | [
"region:us"
] | 2023-09-17T00:36:07Z | 2023-09-15T09:43:38.000Z | 2023-09-15T09:43:38 | ---
pretty_name: Evaluation run of lloorree/jfdslijsijdgis
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/jfdslijsijdgis](https://huggingface.co/lloorree/jfdslijsijdgis) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__jfdslijsijdgis\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-17T00:34:49.304226](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__jfdslijsijdgis/blob/main/results_2023-09-17T00-34-49.304226.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6907933129316588,\n\
\ \"acc_stderr\": 0.03107455661224763,\n \"acc_norm\": 0.694824769775718,\n\
\ \"acc_norm_stderr\": 0.031044197474221744,\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5820460749080146,\n\
\ \"mc2_stderr\": 0.015030523772190541\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.01392100859517935,\n\
\ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6760605457080263,\n\
\ \"acc_stderr\": 0.00467020812857923,\n \"acc_norm\": 0.8695478988249352,\n\
\ \"acc_norm_stderr\": 0.0033611183954523846\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262592,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262592\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7153846153846154,\n \"acc_stderr\": 0.022878322799706304,\n\
\ \"acc_norm\": 0.7153846153846154,\n \"acc_norm_stderr\": 0.022878322799706304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.026841514322958934,\n\
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.026841514322958934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8935779816513761,\n\
\ \"acc_stderr\": 0.013221554674594372,\n \"acc_norm\": 0.8935779816513761,\n\
\ \"acc_norm_stderr\": 0.013221554674594372\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802263,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517964,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517964\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.511731843575419,\n\
\ \"acc_stderr\": 0.016717897676932162,\n \"acc_norm\": 0.511731843575419,\n\
\ \"acc_norm_stderr\": 0.016717897676932162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225188,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225188\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5397653194263363,\n\
\ \"acc_stderr\": 0.012729785386598545,\n \"acc_norm\": 0.5397653194263363,\n\
\ \"acc_norm_stderr\": 0.012729785386598545\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n\
\ \"acc_stderr\": 0.025000256039546195,\n \"acc_norm\": 0.8122448979591836,\n\
\ \"acc_norm_stderr\": 0.025000256039546195\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.0211662163046594,\n\
\ \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.0211662163046594\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n\
\ \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n\
\ \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n\
\ \"acc_stderr\": 0.02567934272327692,\n \"acc_norm\": 0.8713450292397661,\n\
\ \"acc_norm_stderr\": 0.02567934272327692\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n\
\ \"mc2\": 0.5820460749080146,\n \"mc2_stderr\": 0.015030523772190541\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lloorree/jfdslijsijdgis
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|arc:challenge|25_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|arc:challenge|25_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hellaswag|10_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hellaswag|10_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-17T00-34-49.304226.parquet'
- config_name: results
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- results_2023-09-15T09-43-22.432852.parquet
- split: 2023_09_17T00_34_49.304226
path:
- results_2023-09-17T00-34-49.304226.parquet
- split: latest
path:
- results_2023-09-17T00-34-49.304226.parquet
---
# Dataset Card for Evaluation run of lloorree/jfdslijsijdgis
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/jfdslijsijdgis
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/jfdslijsijdgis](https://huggingface.co/lloorree/jfdslijsijdgis) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__jfdslijsijdgis",
"harness_truthfulqa_mc_0",
	split="latest")
```
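As the configuration list above shows, each per-run split name embeds the run timestamp in a zero-padded, year-first form, so the run behind the "latest" alias can be recovered by plain string comparison of the split names. A minimal sketch, using the two run timestamps from this card:

```python
# Per-run split names embed the run timestamp (e.g. 2023_09_17T00_34_49.304226).
# Because the format is zero-padded and big-endian (year first), lexicographic
# string comparison orders the runs chronologically.
run_splits = [
    "2023_09_15T09_43_22.432852",
    "2023_09_17T00_34_49.304226",
]

latest_run = max(run_splits)  # lexicographic max == most recent run
print(latest_run)  # 2023_09_17T00_34_49.304226
```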
## Latest results
These are the [latest results from run 2023-09-17T00:34:49.304226](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__jfdslijsijdgis/blob/main/results_2023-09-17T00-34-49.304226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find them in their timestamped split as well as in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6907933129316588,
"acc_stderr": 0.03107455661224763,
"acc_norm": 0.694824769775718,
"acc_norm_stderr": 0.031044197474221744,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5820460749080146,
"mc2_stderr": 0.015030523772190541
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.01392100859517935,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778764
},
"harness|hellaswag|10": {
"acc": 0.6760605457080263,
"acc_stderr": 0.00467020812857923,
"acc_norm": 0.8695478988249352,
"acc_norm_stderr": 0.0033611183954523846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262592,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262592
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7153846153846154,
"acc_stderr": 0.022878322799706304,
"acc_norm": 0.7153846153846154,
"acc_norm_stderr": 0.022878322799706304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.026841514322958934,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.026841514322958934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8935779816513761,
"acc_stderr": 0.013221554674594372,
"acc_norm": 0.8935779816513761,
"acc_norm_stderr": 0.013221554674594372
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802263,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517964,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517964
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.511731843575419,
"acc_stderr": 0.016717897676932162,
"acc_norm": 0.511731843575419,
"acc_norm_stderr": 0.016717897676932162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225188,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225188
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5397653194263363,
"acc_stderr": 0.012729785386598545,
"acc_norm": 0.5397653194263363,
"acc_norm_stderr": 0.012729785386598545
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546195,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546195
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5820460749080146,
"mc2_stderr": 0.015030523772190541
}
}
```
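For reference, results blocks like the one above are plain JSON and can be parsed with the standard library. This is a minimal sketch (the miniature payload below reuses just two entries copied from the block, not the full file) that prints one headline metric per task:

```python
import json

# Miniature payload in the same shape as the results block above; the
# numbers are copied from the world_religions and truthfulqa entries.
payload = """
{
  "harness|hendrycksTest-world_religions|5": {
    "acc": 0.8713450292397661,
    "acc_stderr": 0.02567934272327692,
    "acc_norm": 0.8713450292397661,
    "acc_norm_stderr": 0.02567934272327692
  },
  "harness|truthfulqa:mc|0": {
    "mc1": 0.41615667074663404,
    "mc1_stderr": 0.017255657502903043,
    "mc2": 0.5820460749080146,
    "mc2_stderr": 0.015030523772190541
  }
}
"""

results = json.loads(payload)
for task, metrics in sorted(results.items()):
    if "acc" in metrics:    # MMLU/ARC/HellaSwag entries report accuracy
        print(f"{task}: acc={metrics['acc']:.4f}")
    elif "mc2" in metrics:  # TruthfulQA reports mc1/mc2 instead
        print(f"{task}: mc2={metrics['mc2']:.4f}")
```

The same loop works on a full results file loaded with `json.load`, since every task entry carries either `acc`/`acc_stderr` or `mc1`/`mc2` keys.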
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_lloorree__kssht-dahj-70b | open-llm-leaderboard | 2023-09-18T23:52:21Z | 70 | 0 | null | [
"region:us"
] | 2023-09-18T23:52:21Z | 2023-09-18T23:51:21.000Z | 2023-09-18T23:51:21 | ---
pretty_name: Evaluation run of lloorree/kssht-dahj-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/kssht-dahj-70b](https://huggingface.co/lloorree/kssht-dahj-70b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-dahj-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T23:50:58.093131](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-dahj-70b/blob/main/results_2023-09-18T23-50-58.093131.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7033014017574061,\n\
\ \"acc_stderr\": 0.03081446175839962,\n \"acc_norm\": 0.7072547203046122,\n\
\ \"acc_norm_stderr\": 0.03078306684205309,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5891645864509103,\n\
\ \"mc2_stderr\": 0.015115214729699759\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6612627986348123,\n \"acc_stderr\": 0.013830568927974332,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403515\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6867157936666003,\n\
\ \"acc_stderr\": 0.0046288092584835265,\n \"acc_norm\": 0.8730332603067118,\n\
\ \"acc_norm_stderr\": 0.003322552829608905\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677098,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677098\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528437,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528437\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7282051282051282,\n \"acc_stderr\": 0.02255655101013236,\n \
\ \"acc_norm\": 0.7282051282051282,\n \"acc_norm_stderr\": 0.02255655101013236\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.908256880733945,\n \"acc_stderr\": 0.012376323409137103,\n \"\
acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137103\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n\
\ \"acc_stderr\": 0.011884488905895538,\n \"acc_norm\": 0.8735632183908046,\n\
\ \"acc_norm_stderr\": 0.011884488905895538\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6458100558659218,\n\
\ \"acc_stderr\": 0.015995644947299225,\n \"acc_norm\": 0.6458100558659218,\n\
\ \"acc_norm_stderr\": 0.015995644947299225\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225184,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225184\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5612777053455019,\n\
\ \"acc_stderr\": 0.012673969883493268,\n \"acc_norm\": 0.5612777053455019,\n\
\ \"acc_norm_stderr\": 0.012673969883493268\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803403,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803403\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546195,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546195\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5891645864509103,\n\
\ \"mc2_stderr\": 0.015115214729699759\n }\n}\n```"
repo_url: https://huggingface.co/lloorree/kssht-dahj-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-50-58.093131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-50-58.093131.parquet'
- config_name: results
data_files:
- split: 2023_09_18T23_50_58.093131
path:
- results_2023-09-18T23-50-58.093131.parquet
- split: latest
path:
- results_2023-09-18T23-50-58.093131.parquet
---
# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-dahj-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-dahj-70b](https://huggingface.co/lloorree/kssht-dahj-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-dahj-70b",
"harness_truthfulqa_mc_0",
split="train")
```
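The config names used with `load_dataset` follow directly from the harness task names that appear in the results JSON below (e.g. `harness|hendrycksTest-abstract_algebra|5` becomes `harness_hendrycksTest_abstract_algebra_5`). A best-effort sketch of that mapping as a simple character substitution — this is an observation about the naming convention in this repo, not an official API:

```python
def task_to_config(task_name: str) -> str:
    """Map a harness task name from the results JSON to the
    dataset config name used with load_dataset, by replacing
    the '|', ':' and '-' separators with underscores."""
    return task_name.replace("|", "_").replace(":", "_").replace("-", "_")

# A few of the tasks evaluated in this run:
for task in ("harness|arc:challenge|25",
             "harness|hellaswag|10",
             "harness|truthfulqa:mc|0",
             "harness|hendrycksTest-abstract_algebra|5"):
    print(task, "->", task_to_config(task))
```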
## Latest results
These are the [latest results from run 2023-09-18T23:50:58.093131](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-dahj-70b/blob/main/results_2023-09-18T23-50-58.093131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7033014017574061,
"acc_stderr": 0.03081446175839962,
"acc_norm": 0.7072547203046122,
"acc_norm_stderr": 0.03078306684205309,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5891645864509103,
"mc2_stderr": 0.015115214729699759
},
"harness|arc:challenge|25": {
"acc": 0.6612627986348123,
"acc_stderr": 0.013830568927974332,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403515
},
"harness|hellaswag|10": {
"acc": 0.6867157936666003,
"acc_stderr": 0.0046288092584835265,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.003322552829608905
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677098,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677098
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528437,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528437
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7282051282051282,
"acc_stderr": 0.02255655101013236,
"acc_norm": 0.7282051282051282,
"acc_norm_stderr": 0.02255655101013236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137103,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137103
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8735632183908046,
"acc_stderr": 0.011884488905895538,
"acc_norm": 0.8735632183908046,
"acc_norm_stderr": 0.011884488905895538
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6458100558659218,
"acc_stderr": 0.015995644947299225,
"acc_norm": 0.6458100558659218,
"acc_norm_stderr": 0.015995644947299225
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225184,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225184
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5612777053455019,
"acc_stderr": 0.012673969883493268,
"acc_norm": 0.5612777053455019,
"acc_norm_stderr": 0.012673969883493268
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803403,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546195,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546195
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5891645864509103,
"mc2_stderr": 0.015115214729699759
}
}
```
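The `"all"` block at the top of the JSON aggregates the per-task scores. A minimal sketch of a macro-average over a few of the entries above (illustrative only — the reported aggregate covers all tasks and may use a different weighting):

```python
# Recompute an unweighted average accuracy over a handful of the
# per-task entries shown above (values copied from the results JSON).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6592592592592592},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8289473684210527},
}

accs = [scores["acc"] for scores in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {mean_acc:.4f}")
```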
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]

---
pretty_name: Evaluation run of Xwin-LM/Xwin-LM-70B-V0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xwin-LM/Xwin-LM-70B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T16:06:53.107330](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1/blob/main/results_2023-10-29T16-06-53.107330.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08504614093959731,\n\
\ \"em_stderr\": 0.0028567126231220596,\n \"f1\": 0.14911073825503293,\n\
\ \"f1_stderr\": 0.003010481134071011,\n \"acc\": 0.5504525862971696,\n\
\ \"acc_stderr\": 0.011424065665063533\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08504614093959731,\n \"em_stderr\": 0.0028567126231220596,\n\
\ \"f1\": 0.14911073825503293,\n \"f1_stderr\": 0.003010481134071011\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2721758908263836,\n \
\ \"acc_stderr\": 0.01225971403516454\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962526\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|arc:challenge|25_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T16_06_53.107330
path:
- '**/details_harness|drop|3_2023-10-29T16-06-53.107330.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T16-06-53.107330.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T16_06_53.107330
path:
- '**/details_harness|gsm8k|5_2023-10-29T16-06-53.107330.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T16-06-53.107330.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hellaswag|10_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T13-08-23.293621.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T13-08-23.293621.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T16_06_53.107330
path:
- '**/details_harness|winogrande|5_2023-10-29T16-06-53.107330.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T16-06-53.107330.parquet'
- config_name: results
data_files:
- split: 2023_09_22T13_08_23.293621
path:
- results_2023-09-22T13-08-23.293621.parquet
- split: 2023_10_29T16_06_53.107330
path:
- results_2023-10-29T16-06-53.107330.parquet
- split: latest
path:
- results_2023-10-29T16-06-53.107330.parquet
---
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-70B-V0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-70B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1",
"harness_winogrande_5",
split="train")
```
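Since each run is stored as a split named after its timestamp, the "latest" split can be reproduced by parsing those names. A minimal sketch, using the two run timestamps that appear in this repository's configs:

```python
from datetime import datetime

# Split names in this details repository encode the run timestamp,
# e.g. "2023_10_29T16_06_53.107330".
splits = ["2023_09_22T13_08_23.293621", "2023_10_29T16_06_53.107330"]

def parse_run(name: str) -> datetime:
    # Underscore-separated date/time with microseconds after the dot.
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

# The "latest" split aliases the run with the newest timestamp.
latest = max(splits, key=parse_run)
print(latest)  # 2023_10_29T16_06_53.107330
```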
## Latest results
These are the [latest results from run 2023-10-29T16:06:53.107330](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-70B-V0.1/blob/main/results_2023-10-29T16-06-53.107330.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08504614093959731,
"em_stderr": 0.0028567126231220596,
"f1": 0.14911073825503293,
"f1_stderr": 0.003010481134071011,
"acc": 0.5504525862971696,
"acc_stderr": 0.011424065665063533
},
"harness|drop|3": {
"em": 0.08504614093959731,
"em_stderr": 0.0028567126231220596,
"f1": 0.14911073825503293,
"f1_stderr": 0.003010481134071011
},
"harness|gsm8k|5": {
"acc": 0.2721758908263836,
"acc_stderr": 0.01225971403516454
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962526
}
}
```
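In the snippet above, the aggregated "all" accuracy appears to be the plain mean of the per-task accuracies (DROP contributes `em`/`f1` rather than `acc`, so only gsm8k and winogrande enter the average). A quick sanity check on the reported numbers:

```python
# Per-task accuracies copied from the latest-results snippet above.
task_acc = {
    "harness|gsm8k|5": 0.2721758908263836,
    "harness|winogrande|5": 0.8287292817679558,
}

# Unweighted mean over the tasks that report an accuracy.
mean_acc = sum(task_acc.values()) / len(task_acc)

# Matches the aggregated "all" accuracy up to floating-point rounding.
assert abs(mean_acc - 0.5504525862971696) < 1e-12
print(mean_acc)
```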
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
sequence: string
- name: q_entity
sequence: string
- name: a_entity
sequence: string
- name: graph
sequence:
sequence: string
- name: choices
sequence: 'null'
splits:
- name: train
num_bytes: 8890766478
num_examples: 27639
- name: validation
num_bytes: 1170336525
num_examples: 3519
- name: test
num_bytes: 1208452620
num_examples: 3531
download_size: 1993772283
dataset_size: 11269555623
---
# Dataset Card for "RoG-cwq"
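The split metadata in the front matter above is internally consistent: the per-split `num_bytes` values sum to the declared `dataset_size`, and the example counts add up to 34,689 question–graph pairs across train, validation, and test. A quick check using the numbers from the card:

```python
# Split statistics copied from the dataset_info block above.
splits = {
    "train": {"num_bytes": 8890766478, "num_examples": 27639},
    "validation": {"num_bytes": 1170336525, "num_examples": 3519},
    "test": {"num_bytes": 1208452620, "num_examples": 3531},
}

total_bytes = sum(s["num_bytes"] for s in splits.values())
total_examples = sum(s["num_examples"] for s in splits.values())

assert total_bytes == 11269555623  # matches dataset_size in the card
print(total_examples)  # 34689
```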
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
---
pretty_name: Evaluation run of jondurbin/airoboros-l2-70b-2.2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-70b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-70b-2.2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-02T00:41:58.859949](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.2.1/blob/main/results_2023-10-02T00-41-58.859949.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6970834854186557,\n\
\ \"acc_stderr\": 0.031037204423526216,\n \"acc_norm\": 0.7009415944284378,\n\
\ \"acc_norm_stderr\": 0.03100649188026674,\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.5949086139726426,\n\
\ \"mc2_stderr\": 0.015268616864386245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6936865166301533,\n\
\ \"acc_stderr\": 0.004600194559865541,\n \"acc_norm\": 0.8795060744871539,\n\
\ \"acc_norm_stderr\": 0.0032487292211528878\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123387,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123387\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.0209233270064233,\n\
\ \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.0209233270064233\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343346,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343346\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078915,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078915\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7564102564102564,\n \"acc_stderr\": 0.021763733684173923,\n\
\ \"acc_norm\": 0.7564102564102564,\n \"acc_norm_stderr\": 0.021763733684173923\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958788,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958788\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813902,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813902\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795663,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967554,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5843575418994413,\n\
\ \"acc_stderr\": 0.016482782187500683,\n \"acc_norm\": 0.5843575418994413,\n\
\ \"acc_norm_stderr\": 0.016482782187500683\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.02440616209466889,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.02440616209466889\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093865,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093865\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.02963483847376601,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.02963483847376601\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5534550195567145,\n\
\ \"acc_stderr\": 0.012697046024399656,\n \"acc_norm\": 0.5534550195567145,\n\
\ \"acc_norm_stderr\": 0.012697046024399656\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7434640522875817,\n \"acc_stderr\": 0.017667841612379005,\n \
\ \"acc_norm\": 0.7434640522875817,\n \"acc_norm_stderr\": 0.017667841612379005\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155764,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155764\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.5949086139726426,\n\
\ \"mc2_stderr\": 0.015268616864386245\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-70b-2.2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|arc:challenge|25_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hellaswag|10_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-02T00-41-58.859949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-02T00-41-58.859949.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-02T00-41-58.859949.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-02T00-41-58.859949.parquet'
- config_name: results
data_files:
- split: 2023_10_02T00_41_58.859949
path:
- results_2023-10-02T00-41-58.859949.parquet
- split: latest
path:
- results_2023-10-02T00-41-58.859949.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-2.2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-70b-2.2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-2.2.1](https://huggingface.co/jondurbin/airoboros-l2-70b-2.2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.2.1",
"harness_truthfulqa_mc_0",
	split="latest")
```
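As the configs above show, per-run split names are the run's ISO timestamp with `-` and `:` replaced by `_`. A minimal sketch of that naming convention (the helper name is illustrative, not part of any library):

```python
def split_name_from_timestamp(run_timestamp: str) -> str:
    """Convert an ISO run timestamp into the split name used by this dataset.

    E.g. "2023-10-02T00:41:58.859949" becomes "2023_10_02T00_41_58.859949".
    """
    # Splits replace '-' and ':' with '_' but keep 'T' and the fractional seconds.
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-10-02T00:41:58.859949"))
# -> 2023_10_02T00_41_58.859949
```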
## Latest results
These are the [latest results from run 2023-10-02T00:41:58.859949](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.2.1/blob/main/results_2023-10-02T00-41-58.859949.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6970834854186557,
"acc_stderr": 0.031037204423526216,
"acc_norm": 0.7009415944284378,
"acc_norm_stderr": 0.03100649188026674,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.5949086139726426,
"mc2_stderr": 0.015268616864386245
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6936865166301533,
"acc_stderr": 0.004600194559865541,
"acc_norm": 0.8795060744871539,
"acc_norm_stderr": 0.0032487292211528878
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.0209233270064233,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.0209233270064233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343346,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343346
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078915,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078915
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.021763733684173923,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.021763733684173923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958788,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958788
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813902,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813902
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795663,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967554,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5843575418994413,
"acc_stderr": 0.016482782187500683,
"acc_norm": 0.5843575418994413,
"acc_norm_stderr": 0.016482782187500683
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.02440616209466889,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.02440616209466889
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093865,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093865
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.02963483847376601,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.02963483847376601
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5534550195567145,
"acc_stderr": 0.012697046024399656,
"acc_norm": 0.5534550195567145,
"acc_norm_stderr": 0.012697046024399656
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7434640522875817,
"acc_stderr": 0.017667841612379005,
"acc_norm": 0.7434640522875817,
"acc_norm_stderr": 0.017667841612379005
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.025172984350155764,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.025172984350155764
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.5949086139726426,
"mc2_stderr": 0.015268616864386245
}
}
```
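Per-task scores like those above can be combined into a macro-average (a plain mean over tasks). This is an illustrative sketch using a few of the reported accuracies, not the leaderboard's exact aggregation code:

```python
# Macro-average a few of the per-task accuracies reported above.
# Illustrative only; the leaderboard's own aggregation may differ.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.35,
    "hendrycksTest-anatomy": 0.6296296296296297,
    "hendrycksTest-astronomy": 0.8223684210526315,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```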
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]