| datasetId | card |
|---|---|
NghiemAbe/ViNLI_4_triplet | ---
dataset_info:
features:
- name: anchor
dtype: string
- name: positive
dtype: string
- name: negative
dtype: string
splits:
- name: train
num_bytes: 1245912
num_examples: 3036
download_size: 724622
dataset_size: 1245912
---
# Dataset Card for "ViNLI_4_triplet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LoganLeis/SonicRivals | ---
license: openrail
---
|
McSpicyWithMilo/target-elements-0.3split | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: target_element
dtype: string
- name: instruction_type
dtype: string
splits:
- name: train
num_bytes: 31885.0
num_examples: 280
- name: test
num_bytes: 13665.0
num_examples: 120
download_size: 24258
dataset_size: 45550.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "target-elements-0.3split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/2b_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 2b/2B (Nikke: Goddess of Victory)
This is the dataset of 2b/2B (Nikke: Goddess of Victory), containing 500 images and their tags.
The core tags of this character are `short_hair, white_hair, hairband, breasts, mole, mole_under_mouth, black_hairband, medium_breasts, covered_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 704.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/2b_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 408.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/2b_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1125 | 788.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/2b_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 626.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/2b_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1125 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/2b_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/2b_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
A list of tag clustering results; specific outfits may be mined from these clusters.
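For example, a minimal sketch of mining one cluster's outfit from the extracted raw archive, continuing from the loading snippet above (the `black_dress` tag is taken from the cluster tables below, and `item.meta['tags']` is assumed to behave as a collection of tag names):
```python
from waifuc.source import LocalSource

# iterate the extracted raw dataset and keep items carrying a specific tag
source = LocalSource('dataset_dir')
black_dress_items = [
    item for item in source
    if 'black_dress' in item.meta['tags']
]
print(len(black_dress_items), 'images tagged black_dress')
```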
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, blue_eyes, alternate_costume, black_skirt, looking_at_viewer, white_shirt, chair, collarbone, contemporary, full_body, no_blindfold, pantyhose, pencil_skirt, sitting |
| 1 | 27 |  |  |  |  |  | 1girl, black_blindfold, black_thighhighs, katana, solo, black_dress, cleavage_cutout, holding_sword, black_gloves, feather-trimmed_sleeves, thigh_boots, juliet_sleeves, leotard, black_footwear, high_heel_boots, thighhighs_under_boots |
| 2 | 7 |  |  |  |  |  | 1girl, black_blindfold, black_dress, black_gloves, black_thighhighs, cleavage_cutout, facing_viewer, feather-trimmed_sleeves, holding_sword, juliet_sleeves, katana, pink_lips, side_slit, solo, thigh_boots, turtleneck, vambraces, black_footwear, cowboy_shot, short_dress, closed_mouth, nose, thighhighs_under_boots, white_leotard, bangs, parted_lips, ribbon, standing |
| 3 | 9 |  |  |  |  |  | 1girl, black_blindfold, black_dress, cleavage_cutout, juliet_sleeves, solo, black_gloves, black_thighhighs, feather-trimmed_sleeves, boots |
| 4 | 6 |  |  |  |  |  | 1girl, black_blindfold, black_dress, cleavage_cutout, closed_mouth, facing_viewer, juliet_sleeves, solo, turtleneck, upper_body, pink_lips, simple_background, sword |
| 5 | 6 |  |  |  |  |  | 1girl, black_dress, blue_eyes, cleavage_cutout, juliet_sleeves, no_blindfold, solo, upper_body, closed_mouth, turtleneck, hair_between_eyes, lips, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blue_eyes | alternate_costume | black_skirt | looking_at_viewer | white_shirt | chair | collarbone | contemporary | full_body | no_blindfold | pantyhose | pencil_skirt | sitting | black_blindfold | black_thighhighs | katana | black_dress | cleavage_cutout | holding_sword | black_gloves | feather-trimmed_sleeves | thigh_boots | juliet_sleeves | leotard | black_footwear | high_heel_boots | thighhighs_under_boots | facing_viewer | pink_lips | side_slit | turtleneck | vambraces | cowboy_shot | short_dress | closed_mouth | nose | white_leotard | bangs | parted_lips | ribbon | standing | boots | upper_body | simple_background | sword | hair_between_eyes | lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------------------|:--------------|:--------------------|:--------------|:--------|:-------------|:---------------|:------------|:---------------|:------------|:---------------|:----------|:------------------|:-------------------|:---------|:--------------|:------------------|:----------------|:---------------|:--------------------------|:--------------|:-----------------|:----------|:-----------------|:------------------|:-------------------------|:----------------|:------------|:------------|:-------------|:------------|:--------------|:--------------|:---------------|:-------|:----------------|:--------|:--------------|:---------|:-----------|:--------|:-------------|:--------------------|:--------|:--------------------|:-------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 27 |  |  |  |  |  | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | | | | | | | | | | | | | X | X | | X | X | | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | | | | | | | | | | | | | X | | | X | X | | | | | X | | | | | X | X | | X | | | | X | | | | | | | | X | X | X | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | | X | | | | | | X | | | | | | | X | X | | | | | X | | | | | | | | X | | | | X | | | | | | | | X | | | X | X |
|
nateraw/auto-cats-and-dogs | ---
task_categories:
- other
task_ids:
- other-image-classification
- image-classification
tags:
- auto-generated
- image-classification
---
# nateraw/auto-cats-and-dogs
Image Classification Dataset
## Usage
```python
from PIL import Image
from datasets import load_dataset
def pil_loader(path: str):
    with open(path, 'rb') as f:
        im = Image.open(f)
        return im.convert('RGB')

def image_loader(example_batch):
    example_batch['image'] = [
        pil_loader(f) for f in example_batch['file']
    ]
    return example_batch

ds = load_dataset('nateraw/auto-cats-and-dogs')
ds = ds.with_transform(image_loader)
```
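Once the transform is attached, indexing or slicing decodes images on the fly; a short usage sketch continuing from the snippet above, assuming the dataset exposes a `train` split with the `file` column used there:
```python
# take a small batch; the transform fills 'image' with decoded PIL images
batch = ds['train'][:2]
print(batch['image'][0].size)  # (width, height) of the first decoded image
```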
|
Francesco/apples-fvpl5 | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': apples
'1': apple
'2': damaged_apple
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: apples-fvpl5
tags:
- rf100
---
# Dataset Card for apples-fvpl5
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/apples-fvpl5
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
apples-fvpl5
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
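A minimal loading sketch, assuming the `datasets` library; the field names follow the schema above, and the `train` split name is an assumption since the card does not list its splits:
```python
from datasets import load_dataset

ds = load_dataset("Francesco/apples-fvpl5", split="train")

example = ds[0]
print(example["image_id"], example["width"], example["height"])
# each bbox is [x_min, y_min, width, height] in COCO format
for bbox, category in zip(example["objects"]["bbox"], example["objects"]["category"]):
    print(category, bbox)
```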
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/apples-fvpl5
### Citation Information
```
@misc{ apples-fvpl5,
title = { apples fvpl5 Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/apples-fvpl5 } },
url = { https://universe.roboflow.com/object-detection/apples-fvpl5 },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
robinsmits/ChatAlpaca-20K | ---
language:
- en
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- conversational
- text-generation
pretty_name: ChatAlpaca 20K
tags:
- chat
- alpaca
- gpt-3.5
dataset_info:
features:
- name: id
dtype: int32
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: original
num_bytes: 68350442
num_examples: 20000
- name: train
num_bytes: 66983433.16
num_examples: 19600
- name: test
num_bytes: 1367008.84
num_examples: 400
download_size: 69122205
dataset_size: 136700884.0
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for ChatAlpaca 20K
## ChatAlpaca: A Multi-Turn Dialogue Corpus based on Alpaca Instructions
### Dataset Description
ChatAlpaca is a chat dataset that aims to help researchers develop models for instruction-following in multi-turn conversations. The dataset extends the [Stanford Alpaca data](https://github.com/tatsu-lab/stanford_alpaca) with multi-turn instructions and their corresponding responses.
ChatAlpaca is developed by the Chinese Information Processing Laboratory at the Institute of Software, Chinese Academy of Sciences (www.icip.org.cn).
In this dataset, we use ChatGPT (GPT-3.5-turbo) to generate follow-up utterances and continue the conversation with ChatGPT. This process results in multi-turn conversations where the simulated user provides instructions and ChatGPT responds accordingly.
### Dataset Structure
The data in this repository consists of a JSON file, `chatalpaca-20k.json`, containing the multi-turn instruction-following conversations. Each item in the data file represents a different conversation between a simulated user and ChatGPT. The data currently contains a total of 20,000 conversations.
I've created 3 splits in the dataset:
* original: all the data as available in the original chatalpaca-20k.json
* train: 19600 rows for training
* test: 400 rows for testing or validation
The conversations in the dataset begin with an instruction, followed by a response from ChatGPT, which are from the Stanford Alpaca data. The conversation then continues with the user providing further instructions and ChatGPT responding accordingly.
Each item in the dataset is a list of dictionaries, each dictionary contains the following fields:
* id: int, identifier
* messages: list, contains multiple dictionaries representing the conversations
Each dictionary in the messages list contains the following fields:
* role: str, with values "user" or "assistant";
* content: str, the conversation content.
Note: in the original .json file the researchers used different field and variable names.
I've renamed them to be consistent with the de facto standard used for chat conversations.
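A minimal sketch of loading the renamed data with the `datasets` library, using the split names listed above:
```python
from datasets import load_dataset

# load the training split; "original" and "test" are the other splits listed above
ds = load_dataset("robinsmits/ChatAlpaca-20K", split="train")

sample = ds[0]
print(sample["id"])
for message in sample["messages"]:
    print(f'{message["role"]}: {message["content"][:80]}')
```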
Here is an example of one conversation:
```json
[
{
"role": "user",
"content": "Identify the odd one out: Twitter, Instagram, Telegram"
},
{
"role": "assistant",
"content": "Telegram"
},
{
"role": "user",
"content": "What makes Telegram different from Twitter and Instagram?"
},
{
"role": "assistant",
"content": "Telegram is a cloud-based instant messaging app that focuses on privacy and security. Unlike Twitter and Instagram which are mainly used for following news, celebrities, and sharing images, Telegram was created as a secure messaging app for private and group communication. Telegram also offers more advanced features than Twitter and Instagram, such as the ability to schedule messages, create bots, and send encrypted messages."
},
{
"role": "user",
"content": "Can you give me an example of how the scheduling messages feature can be useful on Telegram?"
},
{
"role": "assistant",
"content": "Sure! The scheduling messages feature on Telegram can be useful in many different situations. Here's an example:\n\nLet's say you have a workgroup on Telegram, and you want to send a message to everyone in the group to remind them of a meeting. However, you know that most of the people in the group are busy at work during the time you want to send the message.\n\nThat's where the scheduling messages feature comes in. You can schedule the message to be sent at a later time when people are more likely to be free, such as during their lunch break or after work. This way, everyone in the group will receive the message at the same time, and you won't have to worry about anyone missing the important reminder.\n\nOverall, the scheduling messages feature can be really helpful for coordinating communication with groups of people on Telegram, whether it's for work, social events, or anything else."
},
{
"role": "user",
"content": "Goodbye."
}
]
```
### Discussion of Biases
As this dataset contains machine-generated text, it could contain biases. Verify and validate this for your specific usage scenario.
## Citation
Note: I've only put this dataset on Hugging Face because it was not available there. All credit goes to the original researchers and creators.
Please cite the original repository if you use the dataset.
```
@misc{ChatAlpaca,
author = {Ning Bian and Hongyu Lin and Yaojie Lu and Xianpei Han and Le Sun and Ben He },
title = {ChatAlpaca: A Multi-Turn Dialogue Corpus based on Alpaca Instructions},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/cascip/ChatAlpaca}},
}
``` |
athirdpath/DPO_Pairs-Roleplay-Alpaca-NSFW-v1-SHUFFLED | ---
license: cc-by-nc-4.0
language:
- en
tags:
- not-for-all-audiences
---
### Description
\~3.4k DPO pairs, generated by [Iambe](https://huggingface.co/athirdpath/Iambe-20b-DARE-v2-GGUF) feat. GPT-4 (~10% GPT-4, ~80% Iambe @ q5_k_m / ~10% Iambe @ q6_k) with temp 1.2 and min_p 0.15.
They are shuffled this time; I was not aware that TRL does not shuffle automatically until I saw shifts in the dataset composition mirrored in the loss patterns.
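As an aside, if you are preparing your own pairs for TRL, a minimal sketch of shuffling with the `datasets` library before training (the `train` split name is an assumption, and this repo's data is already shuffled, so the call here is purely illustrative):
```python
from datasets import load_dataset

# shuffle once with a fixed seed so ordering artifacts don't show up in the loss curve
pairs = load_dataset("athirdpath/DPO_Pairs-Roleplay-Alpaca-NSFW-v1-SHUFFLED", split="train")
pairs = pairs.shuffle(seed=42)  # already shuffled upstream; shown for illustration only
```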
Iambe is a smart girl, so both the chosen and rejected for each pair are generated at the same time from a single two part prompt (not the one in the dataset). Only a few dozen failed to generate the rejected response, and in those cases I filled in the rejected output with a standard "as an AI" style refusal. The way I set things up caused any prompt formatting errors to automatically go into the REJECTED field, hopefully discouraging such behavior.
The dataset is mostly intended for ERP, so erotic situations are an outsized portion of the dataset. However, it isn't all sexual, with sport, adventure, etc content to provide a baseline.
### Downsides and Weaknesses
This dataset has not been manually cleaned beyond some basic search-and-replace. It has been reviewed in bursts (either the sampled content passed muster or the entire batch was rejected), but much of the content has yet to be read by a human. Some chosen responses have the AI acting for the user; I've removed what I've seen, but this is a lot to filter manually. Despite my best efforts to prompt away from it, some rejected responses are... what seems to be... let's say, in a certain voice? I really tried to avoid it, and most of it was removed.
### Goals
This dataset is intended to be used to produce a BETTER Iambe, one that can itself produce even more data of higher quality. Bootstraps to the moon, baby! So, while this dataset still has rough points, I feel it's worth uploading and starting to train the model. I also want to combine data generated by more models, so as not to just amplify what is already Iambe's signal.
### Bonus
I've also added the notebook I used to train [athirdpath/Iambe-20b-v3_TEST-RP_cDPO](https://huggingface.co/athirdpath/Iambe-20b-v3_TEST-RP_cDPO) on RunPod, adapted from a Kaggle notebook by [@maximelabonne](https://twitter.com/maximelabonne) (thank you!) |
eitanturok/glaive-function-calling-v2-sharegpt-keep-1000-5 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 16413904
num_examples: 5000
download_size: 7490555
dataset_size: 16413904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Spico/Humback | ---
license: apache-2.0
---
|
betotdai/minhavoz | ---
license: apache-2.0
---
|
Nexdata/Japanese_Speech_Data_By_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Japanese_Speech_Data_By_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/947?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
474 hours of Japanese speech data were recorded by mobile phone from 1,245 local Japanese speakers with authentic accents. The recorded texts cover general, interactive, in-car, home, and other categories and are rich in content. The transcriptions of the collected speech have been manually proofread for high accuracy, and the recordings match mainstream Android and Apple mobile phones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/947?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Japanese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
whu9/billsum_postprocess | ---
dataset_info:
features:
- name: source
dtype: string
- name: summary
dtype: string
- name: source_num_tokens
dtype: int64
- name: summary_num_tokens
dtype: int64
splits:
- name: train
num_bytes: 217576274
num_examples: 18949
- name: test
num_bytes: 37517829
num_examples: 3269
- name: ca_test
num_bytes: 14715227
num_examples: 1234
download_size: 112581904
dataset_size: 269809330
---
# Dataset Card for "billsum_postprocess"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Coooori/dialog_data_train_hf | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1375302
num_examples: 801
download_size: 729647
dataset_size: 1375302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dialog_data_train_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pythainlp/thai_usembassy_th2en_prompt | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4932896
num_examples: 615
download_size: 1969489
dataset_size: 4932896
license: cc0-1.0
task_categories:
- text2text-generation
- text-generation
language:
- th
size_categories:
- n<1K
---
# Dataset Card for "thai_usembassy_th2en_prompt"
This dataset was made from [pythainlp/thai_usembassy](https://huggingface.co/datasets/pythainlp/thai_usembassy).
Source code for creating the dataset: [https://github.com/PyThaiNLP/support-aya-datasets/blob/main/translation/thai_usembassy.ipynb](https://github.com/PyThaiNLP/support-aya-datasets/blob/main/translation/thai_usembassy.ipynb)
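A minimal loading sketch, assuming the `datasets` library; the `inputs`/`targets` columns and the `train` split come from the metadata above:
```python
from datasets import load_dataset

ds = load_dataset("pythainlp/thai_usembassy_th2en_prompt", split="train")

sample = ds[0]
print(sample["inputs"])   # Thai translation instruction followed by the Thai source text
print(sample["targets"])  # the English target text
```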
## Template
The Thai instruction in `Inputs` means "Translate the following sentence or paragraph from Thai into English."
```
Inputs: แปลประโยคหรือย่อหน้าต่อไปนี้จากภาษาไทยเป็นภาษาอังกฤษ:\n{th}
Targets: English sentence
``` |
rajaneeshr/deta | ---
license: gpl-3.0
---
|
yejeekang/legal_instruction_token-1200 | ---
license: afl-3.0
---
|
Fung804/makoto-shinkai-picture | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1012000102.307
num_examples: 1347
download_size: 1044953186
dataset_size: 1012000102.307
---
# Dataset Card for "makoto-shinkai-picture"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp | ---
pretty_name: Evaluation run of louisbrulenaudet/Pearl-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [louisbrulenaudet/Pearl-7B-slerp](https://huggingface.co/louisbrulenaudet/Pearl-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T21:12:46.368604](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp/blob/main/results_2024-02-09T21-12-46.368604.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.64584950068684,\n\
\ \"acc_stderr\": 0.031939101516795736,\n \"acc_norm\": 0.645007297068897,\n\
\ \"acc_norm_stderr\": 0.03260424530890984,\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6235249500537349,\n\
\ \"mc2_stderr\": 0.01536713766315419\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6561433447098977,\n \"acc_stderr\": 0.013880644570156215,\n\
\ \"acc_norm\": 0.6800341296928327,\n \"acc_norm_stderr\": 0.013631345807016195\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6878111929894444,\n\
\ \"acc_stderr\": 0.004624393690966905,\n \"acc_norm\": 0.8716391157140012,\n\
\ \"acc_norm_stderr\": 0.0033380760156172633\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n\
\ \"acc_stderr\": 0.024857478080250458,\n \"acc_norm\": 0.8529411764705882,\n\
\ \"acc_norm_stderr\": 0.024857478080250458\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.01623282681867849,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.01623282681867849\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.02512263760881666,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.02512263760881666\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135118,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135118\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6235249500537349,\n\
\ \"mc2_stderr\": 0.01536713766315419\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7361637604245641,\n \
\ \"acc_stderr\": 0.0121393864251268\n }\n}\n```"
repo_url: https://huggingface.co/louisbrulenaudet/Pearl-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-12-46.368604.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T21-12-46.368604.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- '**/details_harness|winogrande|5_2024-02-09T21-12-46.368604.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T21-12-46.368604.parquet'
- config_name: results
data_files:
- split: 2024_02_09T21_12_46.368604
path:
- results_2024-02-09T21-12-46.368604.parquet
- split: latest
path:
- results_2024-02-09T21-12-46.368604.parquet
---
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-slerp](https://huggingface.co/louisbrulenaudet/Pearl-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp",
"harness_winogrande_5",
split="train")
```
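To pull only the aggregated metrics, you can load the "results" configuration instead (a minimal sketch; the config and split names come from the configuration list above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run: the "results" config and the
# "latest" split are both declared in this card's configuration list.
results = load_dataset(
    "open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp",
    "results",
    split="latest",
)
print(results[0])
```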
## Latest results
These are the [latest results from run 2024-02-09T21:12:46.368604](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp/blob/main/results_2024-02-09T21-12-46.368604.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.64584950068684,
"acc_stderr": 0.031939101516795736,
"acc_norm": 0.645007297068897,
"acc_norm_stderr": 0.03260424530890984,
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6235249500537349,
"mc2_stderr": 0.01536713766315419
},
"harness|arc:challenge|25": {
"acc": 0.6561433447098977,
"acc_stderr": 0.013880644570156215,
"acc_norm": 0.6800341296928327,
"acc_norm_stderr": 0.013631345807016195
},
"harness|hellaswag|10": {
"acc": 0.6878111929894444,
"acc_stderr": 0.004624393690966905,
"acc_norm": 0.8716391157140012,
"acc_norm_stderr": 0.0033380760156172633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250458
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.01623282681867849,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.01623282681867849
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.02512263760881666,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.02512263760881666
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135118,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135118
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6235249500537349,
"mc2_stderr": 0.01536713766315419
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242914
},
"harness|gsm8k|5": {
"acc": 0.7361637604245641,
"acc_stderr": 0.0121393864251268
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yawoayite/qg-article-context-question | ---
dataset_info:
features:
- name: Articles
dtype: string
- name: Contextes
dtype: string
- name: Questions
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 462783
num_examples: 424
download_size: 72683
dataset_size: 462783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qg-article-context-question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MoritzLaurer/test-dataset-via-streamlit | ---
dataset_info:
features:
- name: english
dtype: string
- name: label
dtype:
class_label:
names:
'0': label1
'1': label2
splits:
- name: train
num_bytes: 1218.0
num_examples: 3
- name: test
num_bytes: 159
num_examples: 1
download_size: 7754
dataset_size: 1377.0
---
# Dataset Card for "test-dataset-via-streamlit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anlp/annotation2_wo_elimination | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: sentences
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 1326274
num_examples: 3384
download_size: 0
dataset_size: 1326274
---
# Dataset Card for "annotation2_wo_elimination"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/LitCovid_1shot_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 208527771
num_examples: 24960
- name: valid
num_bytes: 52290717
num_examples: 6239
- name: test
num_bytes: 21392053
num_examples: 2500
download_size: 76101058
dataset_size: 282210541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Langame/starter | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- expert-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: ''
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
task_ids: []
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
adibpriatama/rf_online_robust_lava | ---
license: mit
---
|
pintileipetru/autotrain-data-language_model | ---
task_categories:
- translation
---
# AutoTrain Dataset for project: language_model
## Dataset Description
This dataset has been automatically processed by AutoTrain for project language_model.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "Aceste onsule sunt denumite Teritoriile de Nord.",
"target": "Aceste insule sunt denumite Teritoriile de Nord."
},
{
"source": "Care este pozi\u0163ia noastr\u0103dde plecare?",
"target": "Care este pozi\u0163ia noastr\u0103 de plecare?"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2398 |
| valid | 600 |
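A minimal loading sketch (assuming the repository can be loaded directly by name with the `datasets` library; the split names follow the table above):
```python
from datasets import load_dataset

# Load the train and validation splits described above. The repository id and
# the "valid" split name are taken from this card and may need adjusting.
train = load_dataset("pintileipetru/autotrain-data-language_model", split="train")
valid = load_dataset("pintileipetru/autotrain-data-language_model", split="valid")

# Each example pairs a noisy source sentence with its corrected target.
print(train[0]["source"], "->", train[0]["target"])
```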
|
symanto/autextification2023 | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
language:
- en
- es
pretty_name: AuTexTification 2023
size_categories:
- 10K<n<100K
source_datasets:
- multi_eurlex
- xsum
- csebuetnlp/xlsum
- mlsum
- amazon_polarity
- https://sinai.ujaen.es/investigacion/recursos/coah
- https://sinai.ujaen.es/investigacion/recursos/coar
- carblacac/twitter-sentiment-analysis
- cardiffnlp/tweet_sentiment_multilingual
- https://www.kaggle.com/datasets/ricardomoya/tweets-poltica-espaa
- wiki_lingua
---
# Dataset Card for AuTexTification 2023
## Dataset Description
- **Homepage:** https://sites.google.com/view/autextification
- **Repository:** https://github.com/autextification/AuTexTification-Overview
- **Paper:** https://arxiv.org/abs/2309.11285
### Dataset Summary
AuTexTification 2023 @IberLEF2023 is a shared task focusing on Machine-Generated Text Detection and Model Attribution in English and Spanish.
The dataset includes human and generated text in 5 domains: tweets, reviews, how-to articles, news, and legal documents.
The generations are obtained using six language models: BLOOM-1B7, BLOOM-3B, BLOOM-7B1, Babbage, Curie, and text-davinci-003.
For more information, please refer to our overview paper: https://arxiv.org/abs/2309.11285
### Supported Tasks and Leaderboards
- Machine-Generated Text Detection
- Model Attribution
### Languages
English and Spanish
## Dataset Structure
### Data Instances
163k instances of labeled text in total.
### Data Fields
For MGT Detection:
- id
- prompt
- text
- label
- model
- domain
For Model Attribution:
- id
- prompt
- text
- label
- domain
### Data Splits
- MGT Detection Data:
| Language | Split | Human | Generated | Total |
| -------- | ----- | ------ | --------- | ------ |
| English | Train | 17.046 | 16.799 | 33.845 |
| | Test | 10.642 | 11.190 | 21.832 |
| | Total | 27.688 | 27.989 | |
| Spanish | Train | 15.787 | 16.275 | 32.062 |
| | Test | 11.209 | 8.920 | 20.129 |
| | Total | 26.996 | 25.195 | |
- Model Attribution Data:
| | | BLOOM | | | GPT | | | |
| -------- | ----- | ----- | ----- | ----- | ------- | ----- | ---------------- | ------ |
| Language | Split | 1B7 | 3B | 7B | babbage | curie | text-davinci-003 | Total |
| English | Train | 3.562 | 3.648 | 3.687 | 3.870 | 3.822 | 3.827 | 14.767 |
| | Test | 887 | 875 | 952 | 924 | 979 | 988 | 3.638 |
| | Total | 4.449 | 4.523 | 4.639 | 4.794 | 4.801 | 4.815 | |
| Spanish | Train | 3.422 | 3.514 | 3.575 | 3.788 | 3.770 | 3.866 | 14.299 |
| | Test | 870 | 867 | 878 | 946 | 1.004 | 917 | 3.561 |
| | Total | 4.292 | 4.381 | 4.453 | 4.734 | 4.774 | 4.783 | |
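A minimal loading sketch (the available configuration names are not documented in this card, so they are listed programmatically first):
```python
from datasets import load_dataset, get_dataset_config_names

# List the available configurations (task/language combinations); the exact
# config names are not spelled out in this card.
configs = get_dataset_config_names("symanto/autextification2023")
print(configs)

# Load one of them (the "train" split name is assumed from the tables above)
# and inspect a single labeled example.
data = load_dataset("symanto/autextification2023", configs[0], split="train")
print(data[0])
```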
## Dataset Creation
### Curation Rationale
Human data was gathered and used to prompt language models, obtaining generated data.
Specific decisions were made to ensure the data gathering process was carried out in an unbiased manner, making the final human and generated texts probable continuations of a given prefix.
For more detailed information, please refer to the overview paper: https://arxiv.org/abs/2309.11285
### Source Data
The following datasets were used as human text:
- multi_eurlex
- xsum
- csebuetnlp/xlsum
- mlsum
- amazon_polarity
- https://sinai.ujaen.es/investigacion/recursos/coah
- https://sinai.ujaen.es/investigacion/recursos/coar
- carblacac/twitter-sentiment-analysis
- cardiffnlp/tweet_sentiment_multilingual
- https://www.kaggle.com/datasets/ricardomoya/tweets-poltica-espaa
- wiki_lingua
These datasets were only used as sources of human text. The labels of the datasets were not employed in any manner.
### Licensing Information
CC-BY-NC-SA-4.0
### Citation Information
```
@inproceedings{autextification2023,
title = "Overview of AuTexTification at IberLEF 2023: Detection and Attribution of Machine-Generated Text in Multiple Domains",
author = "Sarvazyan, Areg Mikael and
Gonz{\'a}lez, Jos{\'e} {\'A}ngel and
Franco-Salvador, Marc and
Rangel, Francisco and
Chulvi, Berta and
Rosso, Paolo",
month = sep,
year = "2023",
address = "Jaén, Spain",
booktitle = "Procesamiento del Lenguaje Natural",
}
``` |
heliosprime/twitter_dataset_1713216212 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22683
num_examples: 66
download_size: 20019
dataset_size: 22683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713216212"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-econometrics-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 63974
num_examples: 114
download_size: 34157
dataset_size: 63974
---
# Dataset Card for "mmlu-econometrics-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v4_test_700000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 6742140.3
num_examples: 18000
- name: test
num_bytes: 749126.7
num_examples: 2000
download_size: 3216875
dataset_size: 7491267.0
---
# Dataset Card for "final_train_v4_test_700000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_BQA | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 19213713
num_examples: 4725
download_size: 8423267
dataset_size: 19213713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jotasax/carnalito | ---
license: openrail
---
|
open-llm-leaderboard/details_croissantllm__CroissantCool-v0.2 | ---
pretty_name: Evaluation run of croissantllm/CroissantCool-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [croissantllm/CroissantCool-v0.2](https://huggingface.co/croissantllm/CroissantCool-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_croissantllm__CroissantCool-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T00:48:36.907301](https://huggingface.co/datasets/open-llm-leaderboard/details_croissantllm__CroissantCool-v0.2/blob/main/results_2024-04-03T00-48-36.907301.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2501316710730105,\n\
\ \"acc_stderr\": 0.030599339857153456,\n \"acc_norm\": 0.25175123828516227,\n\
\ \"acc_norm_stderr\": 0.03140882407248125,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826844,\n \"mc2\": 0.39340977778528863,\n\
\ \"mc2_stderr\": 0.014574937058003997\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28242320819112626,\n \"acc_stderr\": 0.013155456884097225,\n\
\ \"acc_norm\": 0.318259385665529,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42202748456482775,\n\
\ \"acc_stderr\": 0.004928735103635842,\n \"acc_norm\": 0.545807608046206,\n\
\ \"acc_norm_stderr\": 0.00496879680041041\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.026199808807561915,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.026199808807561915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962883,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962883\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217893,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924318,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924318\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.022755204959542936,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.022755204959542936\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n\
\ \"acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117447,\n\
\ \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222724,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833713,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833713\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1944954128440367,\n \"acc_stderr\": 0.016970289090458064,\n \"\
acc_norm\": 0.1944954128440367,\n \"acc_norm_stderr\": 0.016970289090458064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491652,\n \"\
acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491652\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.19117647058823528,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352167,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352167\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.02905858830374884,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.02905858830374884\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n\
\ \"acc_stderr\": 0.015517322365529619,\n \"acc_norm\": 0.2515964240102171,\n\
\ \"acc_norm_stderr\": 0.015517322365529619\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n\
\ \"acc_stderr\": 0.014054314935614556,\n \"acc_norm\": 0.22905027932960895,\n\
\ \"acc_norm_stderr\": 0.014054314935614556\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23728813559322035,\n\
\ \"acc_stderr\": 0.010865436690780276,\n \"acc_norm\": 0.23728813559322035,\n\
\ \"acc_norm_stderr\": 0.010865436690780276\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \"\
acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225392,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225392\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826844,\n \"mc2\": 0.39340977778528863,\n\
\ \"mc2_stderr\": 0.014574937058003997\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5643251775848461,\n \"acc_stderr\": 0.01393570973961571\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/croissantllm/CroissantCool-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|arc:challenge|25_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|gsm8k|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hellaswag|10_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T00-48-36.907301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T00-48-36.907301.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- '**/details_harness|winogrande|5_2024-04-03T00-48-36.907301.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T00-48-36.907301.parquet'
- config_name: results
data_files:
- split: 2024_04_03T00_48_36.907301
path:
- results_2024-04-03T00-48-36.907301.parquet
- split: latest
path:
- results_2024-04-03T00-48-36.907301.parquet
---
# Dataset Card for Evaluation run of croissantllm/CroissantCool-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [croissantllm/CroissantCool-v0.2](https://huggingface.co/croissantllm/CroissantCool-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_croissantllm__CroissantCool-v0.2",
"harness_winogrande_5",
split="train")
```
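The aggregated metrics shown on the leaderboard live in the "results" configuration listed in the YAML above. A minimal loading sketch is below; the exact schema of that table is not documented here, so the printout is used to discover the actual column names rather than assuming them:
```python
from datasets import load_dataset

# Load the aggregated results configuration (one split per run, plus "latest").
results = load_dataset(
    "open-llm-leaderboard/details_croissantllm__CroissantCool-v0.2",
    "results",
    split="latest",
)

# Inspect the schema before relying on any particular column name.
print(results.column_names)
print(results[0])
```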
## Latest results
These are the [latest results from run 2024-04-03T00:48:36.907301](https://huggingface.co/datasets/open-llm-leaderboard/details_croissantllm__CroissantCool-v0.2/blob/main/results_2024-04-03T00-48-36.907301.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.2501316710730105,
"acc_stderr": 0.030599339857153456,
"acc_norm": 0.25175123828516227,
"acc_norm_stderr": 0.03140882407248125,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826844,
"mc2": 0.39340977778528863,
"mc2_stderr": 0.014574937058003997
},
"harness|arc:challenge|25": {
"acc": 0.28242320819112626,
"acc_stderr": 0.013155456884097225,
"acc_norm": 0.318259385665529,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.42202748456482775,
"acc_stderr": 0.004928735103635842,
"acc_norm": 0.545807608046206,
"acc_norm_stderr": 0.00496879680041041
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.026199808807561915,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.026199808807561915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02924188386962883,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02924188386962883
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217893,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924318,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924318
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2,
"acc_stderr": 0.022755204959542936,
"acc_norm": 0.2,
"acc_norm_stderr": 0.022755204959542936
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117447,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222724,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833713,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833713
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.032162984205936135,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.032162984205936135
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1944954128440367,
"acc_stderr": 0.016970289090458064,
"acc_norm": 0.1944954128440367,
"acc_norm_stderr": 0.016970289090458064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491652,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.02905858830374884,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.02905858830374884
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529619,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529619
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.014054314935614556,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.014054314935614556
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23728813559322035,
"acc_stderr": 0.010865436690780276,
"acc_norm": 0.23728813559322035,
"acc_norm_stderr": 0.010865436690780276
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225392,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225392
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826844,
"mc2": 0.39340977778528863,
"mc2_stderr": 0.014574937058003997
},
"harness|winogrande|5": {
"acc": 0.5643251775848461,
"acc_stderr": 0.01393570973961571
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Defalt-404/my-image-captioning-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: revised_prompt
dtype: string
- name: uid
dtype: int64
- name: incentive
dtype: float64
- name: provider
dtype: string
- name: seed
dtype: int64
- name: cfg_scale
dtype: int64
- name: steps
dtype: int64
- name: model
dtype: string
- name: style
dtype: string
- name: size
dtype: string
- name: quality
dtype: string
- name: is_available
dtype: bool
- name: hotkey
dtype: string
- name: coldkey
dtype: string
splits:
- name: train
num_bytes: 284471025.216
num_examples: 1007
download_size: 283770161
dataset_size: 284471025.216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jaskaransingh/Wine-Reviews | ---
license: apache-2.0
---
|
AdapterOcean/oasst_top1_standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 9159302
num_examples: 957
download_size: 2725712
dataset_size: 9159302
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst_top1_standardized_cluster_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MaxYuki/daiki | ---
license: openrail
--- |
hammh0a/SynthCLIP | ---
license: cc-by-nc-4.0
---
<p style="text-align:center; font-size:2em; font-weight:bold;">SynthCI-30M</p>
<div style="display: flex; justify-content: center; align-items: center; height: 100%;">
<img src="https://i.ibb.co/kDv612p/ef8b63cb-ce63-4246-8aab-6535711f61f5.webp" alt="Alt text" style="max-width:70%; height:auto;">
</div>
This repo contains SynthCI-30M, the dataset proposed in "SynthCLIP: Are We Ready For a Fully Synthetic CLIP Training?". The dataset contains 30M synthetic text-image pairs covering a wide range of concepts.
<div style="text-align:center;">
<p><em>"We will reach a time where machines will create machines."</em></p>
</div>
## Abstract
We present SynthCLIP, a novel framework for training CLIP models with entirely synthetic text-image pairs, significantly departing from previous methods relying on real data. Leveraging recent text-to-image (TTI) generative networks and large language models (LLM), we are able to generate synthetic datasets of images and corresponding captions at any scale, with no human intervention. With training at scale, SynthCLIP achieves performance comparable to CLIP models trained on real datasets. We also introduce SynthCI-30M, a purely synthetic dataset comprising 30 million captioned images.
## Structure
* `SynthCI-30/combined_images_and_captions.csv` contains the image paths with their corresponding captions
* `SynthCI-30/data` contains 3039 zip files, each containing 10K images (see the loading sketch below)
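A minimal loading sketch under stated assumptions: the CSV column names and the archive name used below are placeholders, since neither is documented here; inspect the CSV header and the contents of `SynthCI-30/data` before relying on them:
```python
import io
import zipfile

import pandas as pd
from PIL import Image

# Column names are assumptions; print the header to see what the CSV actually contains.
pairs = pd.read_csv("SynthCI-30/combined_images_and_captions.csv")
print(pairs.columns.tolist())

# Hypothetical archive name; each zip under SynthCI-30/data holds roughly 10K images.
with zipfile.ZipFile("SynthCI-30/data/images_0000.zip") as archive:
    first_name = archive.namelist()[0]
    image = Image.open(io.BytesIO(archive.read(first_name))).convert("RGB")
    print(first_name, image.size)
```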
## Citation
```
@misc{hammoud2024synthclip,
title={SynthCLIP: Are We Ready for a Fully Synthetic CLIP Training?},
author={Hasan Abed Al Kader Hammoud and Hani Itani and Fabio Pizzati and Philip Torr and Adel Bibi and Bernard Ghanem},
year={2024},
eprint={2402.01832},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
|
zerolink/zsql-oracle-dpo | ---
dataset_info:
features:
- name: schema
dtype: string
- name: question
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
- name: weight
dtype: float64
splits:
- name: train
num_bytes: 250828367.48591948
num_examples: 234227
- name: test
num_bytes: 27870651.51408053
num_examples: 26026
download_size: 87153981
dataset_size: 278699019.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
nguyenthanhdo/retrieval-v1.0 | ---
dataset_info:
features:
- name: query
dtype: string
- name: answers
sequence: 'null'
- name: positive_passages
list:
- name: text
dtype: string
- name: title
dtype: string
- name: type
dtype: string
- name: negative_passages
list:
- name: text
dtype: string
- name: title
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 147378500.36674002
num_examples: 70000
- name: val
num_bytes: 5263517.870240714
num_examples: 2500
- name: test
num_bytes: 5231936.76301927
num_examples: 2485
download_size: 85666852
dataset_size: 157873955.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
swaroopajit/next-dataset-refined-batch-12000 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 337818757.0
num_examples: 1000
download_size: 312355831
dataset_size: 337818757.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "next-dataset-refined-batch-12000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EnergyStarAI/image_generation | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 104314
num_examples: 1000
download_size: 45770
dataset_size: 104314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cakiki/java_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 1168673674
num_examples: 43005815
download_size: 919178767
dataset_size: 1168673674
---
# Dataset Card for "java_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nodchip/shogi_suisho5_depth9_entering_king | ---
license: mit
---
# Summary
Training Data for Shogi AI Development
# Contents
- `kifu.tag=suisho5.entering_king.depth=9.num_positions=500000000.start_time=??????????.thread_index=???.bin` ... Training Data
The training data are provided in the YaneuraOu PackedSfenValue format.
This dataset was generated using Suisho5 with a search depth of 9. The starting positions were selectively chosen from Floodgate games spanning the years 2015 to 2024, specifically focusing on scenarios involving entering king positions.
The training data have **not** been shuffled. It is recommended to shuffle the training data before use. Additionally, positions within this dataset have **not** been replaced with the PV (Principal Variation) leaf node from the quiescence search of the original position. Users are advised to consider applying such modifications as needed for their specific use case.
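A minimal shuffling sketch, assuming each PackedSfenValue record is a fixed 40-byte block (verify this against your YaneuraOu build) and that a single file fits in memory; for files with hundreds of millions of positions, a chunked or external-memory shuffle would be more appropriate:
```python
import random

RECORD_SIZE = 40  # assumed PackedSfenValue size in bytes; verify against your trainer


def shuffle_packed_sfen(src_path: str, dst_path: str, seed: int = 0) -> None:
    """Read fixed-size records from src_path, shuffle them, and write them to dst_path."""
    with open(src_path, "rb") as f:
        data = f.read()
    if len(data) % RECORD_SIZE != 0:
        raise ValueError("file size is not a multiple of the assumed record size")
    records = [data[i:i + RECORD_SIZE] for i in range(0, len(data), RECORD_SIZE)]
    random.Random(seed).shuffle(records)
    with open(dst_path, "wb") as f:
        f.write(b"".join(records))
```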
# Links
- nodchip/tanuki-: shogi engine (AI player), stronger than Bonanza6, educational and tiny code (about 2500 lines), USI compliant engine, capable of being compiled by VC++2015 https://github.com/nodchip/tanuki-
|
faisalq/prose | ---
license: afl-3.0
---
|
pietrolesci/pubmed-20k-rct | ---
task_categories:
- text-classification
language:
- en
dataset_info:
features:
- name: abstract_id
dtype: string
- name: labels
dtype:
class_label:
names:
'0': background
'1': conclusions
'2': methods
'3': objective
'4': results
- name: text
dtype: string
- name: sentence_id
dtype: int64
- name: uid
dtype: int64
- name: embedding_all-mpnet-base-v2
sequence: float32
- name: embedding_multi-qa-mpnet-base-dot-v1
sequence: float32
- name: embedding_all-MiniLM-L12-v2
sequence: float32
splits:
- name: train
num_bytes: 1392522399
num_examples: 176642
- name: validation
num_bytes: 233905609
num_examples: 29672
- name: test
num_bytes: 233146005
num_examples: 29578
download_size: 0
dataset_size: 1859574013
---
This is the same dataset as [`armanc/pubmed-rct20k`](https://huggingface.co/datasets/armanc/pubmed-rct20k).
The only differences are:
1. Addition of a unique identifier, `uid`
1. Addition of 3 embedding columns, one for each of the following sentence-transformers models:
- `all-mpnet-base-v2`
- `multi-qa-mpnet-base-dot-v1`
- `all-MiniLM-L12-v2`
1. Renaming of the `label` column to `labels` for easier compatibility with the transformers library |
result-kand2-sdxl-wuerst-karlo/144daf3b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 174
num_examples: 10
download_size: 1351
dataset_size: 174
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "144daf3b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/fuego-20230322-205020-91962a | ---
tags:
- fuego
fuego:
id: 20230322-205020-91962a
status: done
script: script.py
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230322-205020-91962a
space_hardware: cpu-basic
---
|
CODIS/CODIS | ---
license: apache-2.0
task_categories:
- visual-question-answering
language:
- en
pretty_name: CODIS
size_categories:
- n<1K
---
# CODIS: Benchmarking Context-Dependent Visual Comprehension for Multimodal Large Language Models
[**🌐 Homepage**](https://thunlp-mt.github.io/CODIS) | [**📖 arXiv**](https://arxiv.org/abs/2402.13607) | [**Github**](https://github.com/THUNLP-MT/CODIS)
Dataset for paper [CODIS: Benchmarking Context-Dependent Visual Comprehension for Multimodal Large Language Models](https://arxiv.org/abs/2402.13607).
## Introduction
In certain situations, images need to be interpreted within a broader context. We introduce a new benchmark, named **CODIS** (**CO**ntext-**D**ependent **I**mage di**S**ambiguation), designed to assess the ability of models to use context provided in free-form text to enhance visual comprehension.
- Each image in CODIS contains inherent ambiguity that can only be resolved with additional context.
- The questions are deliberately designed to highlight these ambiguities, requiring external context for accurate interpretation.
- For every image-question pair, we provide two contexts in a free-form text format.
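The benchmark can be loaded with the 🤗 `datasets` library. The following is a minimal sketch, assuming the repository can be loaded directly with `load_dataset`; the schema (splits and field names) is not documented on this card, so inspect the returned object before use:
```python
from datasets import load_dataset

# Minimal sketch: print the object to see the actual splits and feature names,
# since the schema is not documented on this card.
ds = load_dataset("CODIS/CODIS")
print(ds)
```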
## Leaderboard
We report results for humans and MLLMs. A model scores only if both of its answers to a pair of queries are correct. The results are based on human evaluation.
| Model | Loc & Ori | Temporal | Cultural | Attributes | Relationships | Average |
|----------------|:-------------:|:-------------:|:-------------:|:-------------:|:-------------:|:-------------:|
| Human | 85.2 | 90.9 | 72.8 | 87.2 | 89.6 | 86.2 |
| GPT4-V | 33.3 | 28.4 | 25.5 | 26.7 | 51.9 | 32.3 |
| Gemini | 21.4 | 29.5 | 21.3 | 24.0 | 34.6 | 26.1 |
| LLaVA-1.5-13B | 6.0 | 4.2 | 10.6 | 14.7 | 13.5 | 9.1 |
| BLIP-2-11B | 6.0 | 8.4 | 4.3 | 6.7 | 11.5 | 7.4 |
| InstructBLIP-13B | 6.0 | 2.1 | 4.3 | 4.0 | 7.7 | 4.5 |
| mPLUG-Owl-2-7B | 13.1 | 9.5 | 6.4 | 12.0 | 19.2 | 11.9 |
| MiniGPT4-7B | 10.7 | 3.2 | 0.0 | 12.0 | 13.5 | 7.9 |
| LLaVA-1.5-7B | 11.9 | 5.3 | 4.3 | 9.3 | 7.7 | 7.9 |
| InstructBLIP-7B | 1.2 | 7.4 | 0.0 | 4.0 | 11.5 | 4.8 |
| Otter-7B | 2.4 | 5.3 | 4.3 | 0.0 | 5.8 | 3.4 |
| LLaVA-7B | 2.4 | 6.3 | 0.0 | 1.3 | 5.8 | 3.4 |
| Qwen-VL-Chat | 3.6 | 3.2 | 0.0 | 1.3 | 9.6 | 3.4 |
| OpenFlamingo-7B | 2.4 | 2.1 | 0.0 | 5.3 | 5.8 | 3.1 |
| BLIP-2-6.7B | 0.0 | 1.1 | 2.1 | 2.7 | 7.7 | 2.3 |
## Citation
```bibtex
@article{luo2024codis,
title={CODIS: Benchmarking Context-Dependent Visual Comprehension for Multimodal Large Language Models},
author={Fuwen Luo and Chi Chen and Zihao Wan and Zhaolu Kang and Qidong Yan and Yingjie Li and Xiaolong Wang and Siyu Wang and Ziyue Wang and Xiaoyue Mi and Peng Li and Ning Ma and Maosong Sun and Yang Liu},
journal={arXiv preprint arXiv:2402.13607},
year={2024}
}
```
|
roa7n/patched_1000_test_p_150_m2_predictions | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: features
sequence: float64
- name: m2_preds
dtype: float32
splits:
- name: train
num_bytes: 9279744396
num_examples: 1035692
download_size: 8814051480
dataset_size: 9279744396
---
# Dataset Card for "patched_1000_test_p_150_m2_predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_de_dev | ---
pretty_name: '`mmarco/de/dev`'
viewer: false
source_datasets: ['irds/mmarco_de']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/de/dev`
The `mmarco/de/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/de/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,093
- `qrels` (relevance assessments); count=59,273
- For `docs`, use [`irds/mmarco_de`](https://huggingface.co/datasets/irds/mmarco_de)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_de_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_de_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
WorkWithData/Artists | ---
license: cc-by-4.0
---
This dataset contains artists from around the world. The data comes from open data sources (MoMA, Tate, Rijks, Interpol). The dataset can also be found on: https://www.workwithdata.com/dataset?entity=artists
Similar datasets can also be found on: https://www.workwithdata.com |
se2p/code-readability-merged | ---
language:
- en
license: unknown
size_categories:
- n<1K
task_categories:
- text-classification
pretty_name: Java Code Readability Merged Dataset
tags:
- readability
- code
- source code
- code readability
- Java
features:
- name: code_snippet
dtype: string
- name: score
dtype: float
dataset_info:
features:
- name: code_snippet
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 354539
num_examples: 421
download_size: 139793
dataset_size: 354539
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Java Code Readability Merged Dataset
This dataset contains **421 Java code snippets** along with a **readability score**, aggregated from several scientific papers [1, 2, 3].
You can download the dataset using Hugging Face:
```python
from datasets import load_dataset
ds = load_dataset("se2p/code-readability-merged")
```
The snippets are **not** split into train and test (and validation) sets. Thus, the whole dataset is in the **train** set:
```python
ds = ds['train']
ds_as_list = ds.to_list() # Convert the dataset to whatever format suits you best
```
The dataset is structured as follows:
```json
{
"code_snippet": ..., # Java source code snippet
"score": ... # Readability score
}
```
The main goal of this repository is to support the training of **readability classifiers for Java source code**.
The dataset is a combination and normalization of three datasets:
1. **Buse**, R. P., & Weimer, W. R. (2009). Learning a metric for code readability. IEEE Transactions on software engineering, 36(4), 546-558.
2. **Dorn**, J. (2012). A General Software Readability Model.
3. **Scalabrino**, S., Linares‐Vásquez, M., Oliveto, R., & Poshyvanyk, D. (2018). A comprehensive model for code readability. Journal of Software: Evolution and Process, 30(6), e1958.
The raw datasets can be downloaded [here](https://dibt.unimol.it/report/readability/).
## Dataset Details
### Dataset Description
- **Curated by:** Buse Raymond PL, Dorn Jonathan, Scalabrino Simone
- **Shared by:** Krodinger Lukas
- **Language(s) (NLP):** Java
- **License:** Unknown
## Uses
The dataset can be used for training Java code readability classifiers.
## Dataset Structure
Each entry of the dataset consists of a **code_snippet** and a **score**.
The code_snippet (string) is the code snippet that was rated in a study by multiple participants.
Participants rated each snippet on a five-point Likert scale, with 1 being very unreadable and 5 being very readable.
The score (float) is the rating averaged over all participants, ranging from 1.0 (very unreadable) to 5.0 (very readable).
## Dataset Creation
### Curation Rationale
To advance code readability classification, the creation of datasets in this research field is of high importance.
As a first step, we provide a merged and normalized version of existing datasets on Hugging Face.
This makes the existing data easier to access and use.
### Source Data
The source of the data are the papers from Buse, Dorn and Scalabrino.
Buse conducted a survey with 120 computer science students (17 from first-year courses, 63 from second-year courses, 30 from third- or fourth-year courses, and 10 graduates) on 100 code snippets.
The code snippets were generated from five open source Java projects.
Dorn conducted a survey with 5000 participants (1800 with industry experience) on 360 code snippets, of which 121 are Java code snippets.
The snippets were drawn from ten open source projects in the SourceForge repository (as of March 15, 2012).
Scalabrino conducted a survey with 9 computer science students on 200 new code snippets.
The snippets were selected from four open source Java projects: jUnit, Hibernate, jFreeChart and ArgoUML.
#### Data Collection and Processing
The dataset was preprocessed by **averaging the readability rating** for each code snippet.
The code snippets and ratings were then **merged** from the three sources.
Each of the three (Buse, Dorn, and Scalabrino) selected their code snippets based on different criteria.
They also had different numbers of participants for their surveys.
One could argue that a code snippet rated by more participants has a more accurate readability score and is therefore more valuable than one with fewer ratings.
However, for simplicity, those differences are ignored.
Other than the selection (and generation) done by the original data source authors, no further processing is applied to the data.
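A minimal sketch of the averaging-and-merging step described above is shown below. The exact layout of the raw files differs between the three sources, so the column names here are illustrative assumptions rather than the actual raw format:
```python
import pandas as pd

def average_and_merge(frames):
 """Average per-rater scores and merge snippets from several sources.

 Each DataFrame in `frames` is assumed to hold a `code_snippet` column plus
 one column per rater (assumed layout, not the actual raw file format).
 """
 merged = []
 for df in frames:
 rater_cols = [c for c in df.columns if c != "code_snippet"]
 scores = df[rater_cols].mean(axis=1) # average rating per snippet
 merged.append(pd.DataFrame({"code_snippet": df["code_snippet"], "score": scores}))
 return pd.concat(merged, ignore_index=True)
```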
#### Who are the source data producers?
The source data producers are the people who wrote the open source Java projects used, as well as the study participants, who were mostly computer science students.
#### Personal and Sensitive Information
The ratings of the code snippets are anonymized and averaged. Thus, no personal or sensitive information is contained in this dataset.
## Bias, Risks, and Limitations
The size of the dataset is very **small**.
The ratings of code snippets were done mostly by **computer science students**, who do not represent the group of Java programmers in general.
### Recommendations
The dataset should be used to train **small** Java code readability classifiers.
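For illustration, a small baseline could look like the following sketch; the feature extraction and model choice are assumptions for demonstration, not part of the dataset itself:
```python
from datasets import load_dataset
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Load the snippets and their averaged readability scores.
ds = load_dataset("se2p/code-readability-merged")["train"]
texts, scores = ds["code_snippet"], ds["score"]

# Character n-gram TF-IDF features plus ridge regression as a simple baseline.
model = make_pipeline(
 TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4), max_features=5000),
 Ridge(),
)
print(cross_val_score(model, texts, scores, cv=5, scoring="r2"))
```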
## Citation
1. Buse, R. P., & Weimer, W. R. (2009). Learning a metric for code readability. IEEE Transactions on software engineering, 36(4), 546-558.
2. Dorn, J. (2012). A General Software Readability Model.
3. Scalabrino, S., Linares‐Vásquez, M., Oliveto, R., & Poshyvanyk, D. (2018). A comprehensive model for code readability. Journal of Software: Evolution and Process, 30(6), e1958.
```bibtex
@article{buse2009learning,
title={Learning a metric for code readability},
author={Buse, Raymond PL and Weimer, Westley R},
journal={IEEE Transactions on software engineering},
volume={36},
number={4},
pages={546--558},
year={2009},
publisher={IEEE}
}
@inproceedings{dorn2012general,
title={A General Software Readability Model},
author={Jonathan Dorn},
year={2012},
url={https://api.semanticscholar.org/CorpusID:14098740}
}
@article{scalabrino2018comprehensive,
title={A comprehensive model for code readability},
author={Scalabrino, Simone and Linares-V{\'a}squez, Mario and Oliveto, Rocco and Poshyvanyk, Denys},
journal={Journal of Software: Evolution and Process},
volume={30},
number={6},
pages={e1958},
year={2018},
publisher={Wiley Online Library}
}
```
## Dataset Card Authors
Lukas Krodinger, [Chair of Software Engineering II](https://www.fim.uni-passau.de/en/chair-for-software-engineering-ii), [University of Passau](https://www.uni-passau.de/en/).
## Dataset Card Contact
Feel free to contact me via [E-Mail](mailto:krodin03@ads.uni-passau.de) if you have any questions or remarks. |
Falah/avatar_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1508885
num_examples: 5000
download_size: 203485
dataset_size: 1508885
---
# Dataset Card for "avatar_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LLMLegalAssistant/eur-lex-2023_24 | ---
license: mit
---
|
open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m-instruct | ---
pretty_name: Evaluation run of AI-Sweden-Models/gpt-sw3-126m-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AI-Sweden-Models/gpt-sw3-126m-instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-06T17:24:12.655083](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m-instruct/blob/main/results_2023-12-06T17-24-12.655083.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23830785704981358,\n\
\ \"acc_stderr\": 0.030091824926520686,\n \"acc_norm\": 0.23875634303781132,\n\
\ \"acc_norm_stderr\": 0.030838942303782275,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299965,\n \"mc2\": 0.42647039245716606,\n\
\ \"mc2_stderr\": 0.014756647007334998\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702254,\n\
\ \"acc_norm\": 0.23378839590443687,\n \"acc_norm_stderr\": 0.012368225378507144\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28360884285998805,\n\
\ \"acc_stderr\": 0.004498280244494508,\n \"acc_norm\": 0.29884485162318264,\n\
\ \"acc_norm_stderr\": 0.004568161710399553\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313143,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313143\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310052,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310052\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.034550710191021475,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.034550710191021475\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n\
\ \"acc_stderr\": 0.024137632429337707,\n \"acc_norm\": 0.23548387096774193,\n\
\ \"acc_norm_stderr\": 0.024137632429337707\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.027719315709614778,\n\
\ \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.027719315709614778\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047707,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229876,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229876\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860667,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860667\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.02047323317355198,\n\
\ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02047323317355198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786382,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786382\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790236,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790236\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.14563106796116504,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.14563106796116504,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1752136752136752,\n\
\ \"acc_stderr\": 0.024904439098918214,\n \"acc_norm\": 0.1752136752136752,\n\
\ \"acc_norm_stderr\": 0.024904439098918214\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n\
\ \"acc_stderr\": 0.015357212665829479,\n \"acc_norm\": 0.24393358876117496,\n\
\ \"acc_norm_stderr\": 0.015357212665829479\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071145,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071145\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.19934640522875818,\n \"acc_stderr\": 0.022875816993464054,\n\
\ \"acc_norm\": 0.19934640522875818,\n \"acc_norm_stderr\": 0.022875816993464054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19614147909967847,\n\
\ \"acc_stderr\": 0.022552447780478022,\n \"acc_norm\": 0.19614147909967847,\n\
\ \"acc_norm_stderr\": 0.022552447780478022\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528023,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528023\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.029393609319879815,\n\
\ \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.029393609319879815\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.033014059469872487,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.033014059469872487\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299965,\n \"mc2\": 0.42647039245716606,\n\
\ \"mc2_stderr\": 0.014756647007334998\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.48539857932123126,\n \"acc_stderr\": 0.014046492383275839\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \
\ \"acc_stderr\": 0.0027210765770416625\n }\n}\n```"
repo_url: https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|arc:challenge|25_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|gsm8k|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hellaswag|10_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-24-12.655083.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T17-24-12.655083.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- '**/details_harness|winogrande|5_2023-12-06T17-24-12.655083.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-06T17-24-12.655083.parquet'
- config_name: results
data_files:
- split: 2023_12_06T17_24_12.655083
path:
- results_2023-12-06T17-24-12.655083.parquet
- split: latest
path:
- results_2023-12-06T17-24-12.655083.parquet
---
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-126m-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-126m-instruct](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m-instruct",
"harness_winogrande_5",
split="train")
```
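Similarly, the aggregated metrics can be pulled from the "results" configuration declared above. A minimal sketch, assuming the same repository id and the "latest" split listed in the YAML header (the exact record layout may vary between runs):
```python
from datasets import load_dataset

# A minimal sketch: load the aggregated results of the run from the
# "results" configuration; the "latest" split mirrors the newest timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m-instruct",
    "results",
    split="latest",
)

print(results[0])  # one row of aggregated metrics for the run
```
Any of the per-task configurations listed in the YAML header (for example `harness_hendrycksTest_world_religions_5`) can be loaded the same way by swapping in its configuration name.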
## Latest results
These are the [latest results from run 2023-12-06T17:24:12.655083](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m-instruct/blob/main/results_2023-12-06T17-24-12.655083.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23830785704981358,
"acc_stderr": 0.030091824926520686,
"acc_norm": 0.23875634303781132,
"acc_norm_stderr": 0.030838942303782275,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299965,
"mc2": 0.42647039245716606,
"mc2_stderr": 0.014756647007334998
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702254,
"acc_norm": 0.23378839590443687,
"acc_norm_stderr": 0.012368225378507144
},
"harness|hellaswag|10": {
"acc": 0.28360884285998805,
"acc_stderr": 0.004498280244494508,
"acc_norm": 0.29884485162318264,
"acc_norm_stderr": 0.004568161710399553
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313143,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313143
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310052,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310052
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749895,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749895
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.034550710191021475,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.034550710191021475
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.024137632429337707,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.024137632429337707
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.027719315709614778,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.027719315709614778
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047707,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229876,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229876
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860667,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860667
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02047323317355198,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02047323317355198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786382,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786382
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790236,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790236
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.14563106796116504,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.14563106796116504,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1752136752136752,
"acc_stderr": 0.024904439098918214,
"acc_norm": 0.1752136752136752,
"acc_norm_stderr": 0.024904439098918214
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24393358876117496,
"acc_stderr": 0.015357212665829479,
"acc_norm": 0.24393358876117496,
"acc_norm_stderr": 0.015357212665829479
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071145,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071145
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.19934640522875818,
"acc_stderr": 0.022875816993464054,
"acc_norm": 0.19934640522875818,
"acc_norm_stderr": 0.022875816993464054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19614147909967847,
"acc_stderr": 0.022552447780478022,
"acc_norm": 0.19614147909967847,
"acc_norm_stderr": 0.022552447780478022
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.02624492034984301,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.02624492034984301
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443738,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528023,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528023
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.029393609319879815,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.029393609319879815
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.033014059469872487,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.033014059469872487
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299965,
"mc2": 0.42647039245716606,
"mc2_stderr": 0.014756647007334998
},
"harness|winogrande|5": {
"acc": 0.48539857932123126,
"acc_stderr": 0.014046492383275839
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416625
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DianaJin/nuna | ---
dataset_info:
features:
- name: audio
struct:
- name: array
dtype: 'null'
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1198475
num_examples: 1732
- name: test
num_bytes: 151376
num_examples: 217
- name: valid
num_bytes: 146396
num_examples: 217
download_size: 688378
dataset_size: 1496247
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
mehranandi/Forfirstai | ---
license: apache-2.0
---
|
fededeleon/CriteriosClasificacion | ---
license: mit
---
|
5CD-AI/Vietnamese-1m5-kaist-CoT-gg-translated-unrefined | ---
task_categories:
- question-answering
language:
- vi
- en
--- |
grail_qa | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids: []
paperswithcode_id: null
pretty_name: Grail QA
tags:
- knowledge-base-qa
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: answer
sequence:
- name: answer_type
dtype: string
- name: answer_argument
dtype: string
- name: entity_name
dtype: string
- name: function
dtype: string
- name: num_node
dtype: int32
- name: num_edge
dtype: int32
- name: graph_query
struct:
- name: nodes
sequence:
- name: nid
dtype: int32
- name: node_type
dtype: string
- name: id
dtype: string
- name: class
dtype: string
- name: friendly_name
dtype: string
- name: question_node
dtype: int32
- name: function
dtype: string
- name: edges
sequence:
- name: start
dtype: int32
- name: end
dtype: int32
- name: relation
dtype: string
- name: friendly_name
dtype: string
- name: sparql_query
dtype: string
- name: domains
sequence: string
- name: level
dtype: string
- name: s_expression
dtype: string
splits:
- name: train
num_bytes: 69433121
num_examples: 44337
- name: validation
num_bytes: 9800544
num_examples: 6763
- name: test
num_bytes: 2167256
num_examples: 13231
download_size: 17636773
dataset_size: 81400921
---
# Dataset Card for Grail QA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Grail QA](https://dki-lab.github.io/GrailQA/)
- **Repository:**
- **Paper:** [GrailQA paper (Gu et al. '20)](https://arxiv.org/abs/2011.07743)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
#### What is GrailQA?
Strongly Generalizable Question Answering (GrailQA) is a new large-scale, high-quality dataset for question answering on knowledge bases (KBQA) over Freebase, with 64,331 questions annotated with both answers and corresponding logical forms in different syntaxes (e.g., SPARQL, S-expression). It can be used to test three levels of generalization in KBQA: i.i.d., compositional, and zero-shot.
#### Why GrailQA?
GrailQA is by far the largest crowdsourced KBQA dataset, with questions of high diversity (questions in GrailQA can have up to 4 relations and optionally a function such as counting, superlatives, or comparatives). It also has the highest coverage over Freebase, spanning 3,720 relations and 86 domains. Last but not least, our meticulous data split allows GrailQA to test not only i.i.d. generalization, but also compositional generalization and zero-shot generalization, which are critical for practical KBQA systems.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English and Graph query
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- `qid` (`str`)
- `question` (`str`)
- `answer` (`List`): Defaults to `[]` in test split.
- `answer_type` (`str`)
- `answer_argument` (`str`)
  - `entity_name` (`str`): Defaults to `""` if `answer_type` is not `Entity`.
- `function` (`string`): Defaults to `""` in test split.
- `num_node` (`int`): Defaults to `-1` in test split.
- `num_edge` (`int`): Defaults to `-1` in test split.
- `graph_query` (`Dict`)
- `nodes` (`List`): Defaults to `[]` in test split.
- `nid` (`int`)
- `node_type` (`str`)
- `id` (`str`)
- `class` (`str`)
- `friendly_name` (`str`)
- `question_node` (`int`)
- `function` (`str`)
- `edges` (`List`): Defaults to `[]` in test split.
- `start` (`int`)
- `end` (`int`)
- `relation` (`str`)
- `friendly_name` (`str`)
- `sparql_query` (`str`): Defaults to `""` in test split.
- `domains` (`List[str]`): Defaults to `[]` in test split.
- `level` (`str`): Only available in validation split. Defaults to `""` in others.
- `s_expression` (`str`): Defaults to `""` in test split.
**Notes:** Only `qid` and `question` are available in the test split (see the loading sketch below).
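A minimal loading sketch for inspecting these fields with the `datasets` library, assuming the Hub id `grail_qa` shown above; the validation split is used because it carries the full annotations, including `level`:
```python
from datasets import load_dataset

# A minimal sketch: load the validation split, which carries full annotations
# (the test split only has `qid` and `question`).
ds = load_dataset("grail_qa", split="validation")

example = ds[0]
print(example["question"])      # natural-language question
print(example["s_expression"])  # logical form as an S-expression
print(example["level"])         # generalization level: i.i.d., compositional, or zero-shot
```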
### Data Splits
Dataset Split | Number of Instances in Split
--------------|--------------------------------------------
Train | 44,337
Validation | 6,763
Test | 13,231
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@mattbui](https://github.com/mattbui) for adding this dataset. |
rdev12/test_corpus | ---
dataset_info:
features:
- name: doc_id
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 7580154
num_examples: 9680
download_size: 4516561
dataset_size: 7580154
---
# Dataset Card for "test_corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
udmurtNLP/soviet-geography-book-rus-udm-parallel-corpora | ---
dataset_info:
features:
- name: rus
dtype: string
- name: udm
dtype: string
splits:
- name: train
num_bytes: 604740
num_examples: 2783
download_size: 298539
dataset_size: 604740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
language:
- udm
task_categories:
- translation
size_categories:
- 1K<n<10K
---
Data taken from http://comp.komicorpora.ru/
Book: Л. Г. Терехова и В. Г. Эрдели. География. Часть первая (1938) (L. G. Terekhova and V. G. Erdeli, Geography, Part One, 1938)
dhuynh95/Magicoder-Evol-Instruct-110K-Filtered | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10675290
num_examples: 5000
download_size: 5542965
dataset_size: 10675290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
supermomo668/Llama2D-Pretrain | ---
dataset_info:
features:
- name: input_ids
sequence: float32
- name: coords
sequence:
sequence: float32
- name: labels
sequence: float32
- name: attention_mask
sequence: float32
splits:
- name: train
num_bytes: 14226320
num_examples: 395
download_size: 834338
dataset_size: 14226320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Llama2D-Pretrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linhtran92/infer_on_testds_v1 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371437.027
num_examples: 1299
download_size: 164200336
dataset_size: 174371437.027
---
# Dataset Card for "infer_on_testds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hf-internal-testing/audiofolder_two_configs_in_metadata | ---
configs:
- config_name: v1
data_dir: v1
drop_labels: true
- config_name: v2
data_dir: v2
drop_labels: false
duplicated_from: polinaeterna/audiofolder_two_configs_in_metadata
---
|
Deojoandco/ah | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: 'null'
- name: over_18
dtype: bool
- name: created_utc
dtype: float64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: 'null'
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
splits:
- name: train
num_bytes: 90583
num_examples: 26
download_size: 75015
dataset_size: 90583
---
# Dataset Card for "ah"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abelbuba/ai-journals | ---
license: apache-2.0
---
|
Dahoas/MATH_full_chat_format | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 12599878
num_examples: 7500
- name: test
num_bytes: 7885936
num_examples: 5000
download_size: 9829571
dataset_size: 20485814
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
KeKu/poetry | ---
license: llama2
---
|
G12345/sidewalk-imagery17 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3138225.0
num_examples: 10
download_size: 0
dataset_size: 3138225.0
---
# Dataset Card for "sidewalk-imagery17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AA051611__V0201 | ---
pretty_name: Evaluation run of AA051611/V0201
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051611/V0201](https://huggingface.co/AA051611/V0201) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__V0201\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T03:15:18.446534](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0201/blob/main/results_2024-02-02T03-15-18.446534.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8722899277105903,\n\
\ \"acc_stderr\": 0.021779827433248626,\n \"acc_norm\": 0.8832174168880055,\n\
\ \"acc_norm_stderr\": 0.022071903413890245,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.5375985523274007,\n\
\ \"mc2_stderr\": 0.015202763451961539\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.014077223108470139,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6309500099581756,\n\
\ \"acc_stderr\": 0.004815613144385407,\n \"acc_norm\": 0.8330013941445927,\n\
\ \"acc_norm_stderr\": 0.00372212370961046\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.837037037037037,\n\
\ \"acc_stderr\": 0.03190541474482841,\n \"acc_norm\": 0.837037037037037,\n\
\ \"acc_norm_stderr\": 0.03190541474482841\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9539473684210527,\n \"acc_stderr\": 0.01705693362806048,\n\
\ \"acc_norm\": 0.9539473684210527,\n \"acc_norm_stderr\": 0.01705693362806048\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.86,\n\
\ \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \
\ \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.9358490566037736,\n \"acc_stderr\": 0.015080038966069792,\n\
\ \"acc_norm\": 0.9358490566037736,\n \"acc_norm_stderr\": 0.015080038966069792\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9652777777777778,\n\
\ \"acc_stderr\": 0.01530953117500374,\n \"acc_norm\": 0.9652777777777778,\n\
\ \"acc_norm_stderr\": 0.01530953117500374\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.86,\n \"acc_stderr\": 0.03487350880197772,\n \"acc_norm\": 0.86,\n\
\ \"acc_norm_stderr\": 0.03487350880197772\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8786127167630058,\n\
\ \"acc_stderr\": 0.024901248066383764,\n \"acc_norm\": 0.8786127167630058,\n\
\ \"acc_norm_stderr\": 0.024901248066383764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \"acc_norm\": 0.91,\n\
\ \"acc_norm_stderr\": 0.028762349126466115\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8936170212765957,\n \"acc_stderr\": 0.02015597730704985,\n\
\ \"acc_norm\": 0.8936170212765957,\n \"acc_norm_stderr\": 0.02015597730704985\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7894736842105263,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.7894736842105263,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8896551724137931,\n \"acc_stderr\": 0.026109923428966807,\n\
\ \"acc_norm\": 0.8896551724137931,\n \"acc_norm_stderr\": 0.026109923428966807\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.8862433862433863,\n \"acc_stderr\": 0.016352876480494796,\n \"\
acc_norm\": 0.8862433862433863,\n \"acc_norm_stderr\": 0.016352876480494796\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.7301587301587301,\n\
\ \"acc_stderr\": 0.03970158273235171,\n \"acc_norm\": 0.7301587301587301,\n\
\ \"acc_norm_stderr\": 0.03970158273235171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9612903225806452,\n \"acc_stderr\": 0.010973819726797958,\n \"\
acc_norm\": 0.9612903225806452,\n \"acc_norm_stderr\": 0.010973819726797958\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.8078817733990148,\n \"acc_stderr\": 0.02771931570961478,\n \"\
acc_norm\": 0.8078817733990148,\n \"acc_norm_stderr\": 0.02771931570961478\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466115,\n \"acc_norm\"\
: 0.91,\n \"acc_norm_stderr\": 0.028762349126466115\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9212121212121213,\n \"acc_stderr\": 0.021037183825716357,\n\
\ \"acc_norm\": 0.9212121212121213,\n \"acc_norm_stderr\": 0.021037183825716357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9646464646464646,\n \"acc_stderr\": 0.01315731887804608,\n \"\
acc_norm\": 0.9646464646464646,\n \"acc_norm_stderr\": 0.01315731887804608\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084346,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.9128205128205128,\n \"acc_stderr\": 0.014302931207177386,\n\
\ \"acc_norm\": 0.9128205128205128,\n \"acc_norm_stderr\": 0.014302931207177386\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.7888888888888889,\n \"acc_stderr\": 0.024882116857655078,\n \
\ \"acc_norm\": 0.7888888888888889,\n \"acc_norm_stderr\": 0.024882116857655078\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.9411764705882353,\n \"acc_stderr\": 0.015283995352038426,\n\
\ \"acc_norm\": 0.9411764705882353,\n \"acc_norm_stderr\": 0.015283995352038426\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.7682119205298014,\n \"acc_stderr\": 0.03445406271987054,\n \"\
acc_norm\": 0.7682119205298014,\n \"acc_norm_stderr\": 0.03445406271987054\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9743119266055046,\n \"acc_stderr\": 0.006782898624451454,\n \"\
acc_norm\": 0.9743119266055046,\n \"acc_norm_stderr\": 0.006782898624451454\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02541642838876747,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02541642838876747\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9705882352941176,\n \"acc_stderr\": 0.011858507536737417,\n \"\
acc_norm\": 0.9705882352941176,\n \"acc_norm_stderr\": 0.011858507536737417\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370188,\n \
\ \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370188\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8923766816143498,\n\
\ \"acc_stderr\": 0.020799400082880004,\n \"acc_norm\": 0.8923766816143498,\n\
\ \"acc_norm_stderr\": 0.020799400082880004\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n\
\ \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9173553719008265,\n \"acc_stderr\": 0.02513538235660422,\n \"\
acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.02513538235660422\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9629629629629629,\n\
\ \"acc_stderr\": 0.018257067489429676,\n \"acc_norm\": 0.9629629629629629,\n\
\ \"acc_norm_stderr\": 0.018257067489429676\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9447852760736196,\n \"acc_stderr\": 0.017944712448654636,\n\
\ \"acc_norm\": 0.9447852760736196,\n \"acc_norm_stderr\": 0.017944712448654636\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.8392857142857143,\n\
\ \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.8392857142857143,\n\
\ \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9611650485436893,\n \"acc_stderr\": 0.019129793517354922,\n\
\ \"acc_norm\": 0.9611650485436893,\n \"acc_norm_stderr\": 0.019129793517354922\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9829059829059829,\n\
\ \"acc_stderr\": 0.008491806622565604,\n \"acc_norm\": 0.9829059829059829,\n\
\ \"acc_norm_stderr\": 0.008491806622565604\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429086,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9578544061302682,\n\
\ \"acc_stderr\": 0.007184928704935858,\n \"acc_norm\": 0.9578544061302682,\n\
\ \"acc_norm_stderr\": 0.007184928704935858\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8872832369942196,\n \"acc_stderr\": 0.017026126074681635,\n\
\ \"acc_norm\": 0.8872832369942196,\n \"acc_norm_stderr\": 0.017026126074681635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.9050279329608939,\n\
\ \"acc_stderr\": 0.009805284011337068,\n \"acc_norm\": 0.9050279329608939,\n\
\ \"acc_norm_stderr\": 0.009805284011337068\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.014476405218161428,\n\
\ \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.014476405218161428\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8971061093247589,\n\
\ \"acc_stderr\": 0.017255830051445344,\n \"acc_norm\": 0.8971061093247589,\n\
\ \"acc_norm_stderr\": 0.017255830051445344\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.9197530864197531,\n \"acc_stderr\": 0.015116405542849367,\n\
\ \"acc_norm\": 0.9197530864197531,\n \"acc_norm_stderr\": 0.015116405542849367\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.7872340425531915,\n \"acc_stderr\": 0.024414612974307713,\n \
\ \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.024414612974307713\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8305084745762712,\n\
\ \"acc_stderr\": 0.009582414456640188,\n \"acc_norm\": 0.8305084745762712,\n\
\ \"acc_norm_stderr\": 0.009582414456640188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9411764705882353,\n \"acc_stderr\": 0.014293099746606797,\n\
\ \"acc_norm\": 0.9411764705882353,\n \"acc_norm_stderr\": 0.014293099746606797\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.9019607843137255,\n \"acc_stderr\": 0.012030208014297142,\n \
\ \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.012030208014297142\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8454545454545455,\n\
\ \"acc_stderr\": 0.03462262571262667,\n \"acc_norm\": 0.8454545454545455,\n\
\ \"acc_norm_stderr\": 0.03462262571262667\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.9061224489795918,\n \"acc_stderr\": 0.018671508543506656,\n\
\ \"acc_norm\": 0.9061224489795918,\n \"acc_norm_stderr\": 0.018671508543506656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9751243781094527,\n\
\ \"acc_stderr\": 0.011012907274218229,\n \"acc_norm\": 0.9751243781094527,\n\
\ \"acc_norm_stderr\": 0.011012907274218229\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \
\ \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.7168674698795181,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.7168674698795181,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9298245614035088,\n \"acc_stderr\": 0.019591541754525123,\n\
\ \"acc_norm\": 0.9298245614035088,\n \"acc_norm_stderr\": 0.019591541754525123\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.5375985523274007,\n\
\ \"mc2_stderr\": 0.015202763451961539\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.535253980288097,\n \
\ \"acc_stderr\": 0.013738207990177317\n }\n}\n```"
repo_url: https://huggingface.co/AA051611/V0201
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-15-18.446534.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- '**/details_harness|winogrande|5_2024-02-02T03-15-18.446534.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T03-15-18.446534.parquet'
- config_name: results
data_files:
- split: 2024_02_02T03_15_18.446534
path:
- results_2024-02-02T03-15-18.446534.parquet
- split: latest
path:
- results_2024-02-02T03-15-18.446534.parquet
---
# Dataset Card for Evaluation run of AA051611/V0201
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/V0201](https://huggingface.co/AA051611/V0201) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__V0201",
"harness_winogrande_5",
split="train")
```
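The same call works for any of the configurations listed in the YAML header above. As a minimal sketch, you could also read the most recent run of a task through its `latest` split, or pull the aggregated scores from the `results` configuration:
```python
from datasets import load_dataset

# Most recent run of a single task: use the "latest" split of its configuration.
winogrande_latest = load_dataset(
    "open-llm-leaderboard/details_AA051611__V0201",
    "harness_winogrande_5",
    split="latest",
)

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_AA051611__V0201",
    "results",
    split="latest",
)
print(results)
```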
## Latest results
These are the [latest results from run 2024-02-02T03:15:18.446534](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__V0201/blob/main/results_2024-02-02T03-15-18.446534.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the "results" and the "latest" splits for each eval):
```python
{
"all": {
"acc": 0.8722899277105903,
"acc_stderr": 0.021779827433248626,
"acc_norm": 0.8832174168880055,
"acc_norm_stderr": 0.022071903413890245,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.5375985523274007,
"mc2_stderr": 0.015202763451961539
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.014077223108470139,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6309500099581756,
"acc_stderr": 0.004815613144385407,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.00372212370961046
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.837037037037037,
"acc_stderr": 0.03190541474482841,
"acc_norm": 0.837037037037037,
"acc_norm_stderr": 0.03190541474482841
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9539473684210527,
"acc_stderr": 0.01705693362806048,
"acc_norm": 0.9539473684210527,
"acc_norm_stderr": 0.01705693362806048
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.9358490566037736,
"acc_stderr": 0.015080038966069792,
"acc_norm": 0.9358490566037736,
"acc_norm_stderr": 0.015080038966069792
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9652777777777778,
"acc_stderr": 0.01530953117500374,
"acc_norm": 0.9652777777777778,
"acc_norm_stderr": 0.01530953117500374
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197772,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197772
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.8786127167630058,
"acc_stderr": 0.024901248066383764,
"acc_norm": 0.8786127167630058,
"acc_norm_stderr": 0.024901248066383764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466115,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466115
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8936170212765957,
"acc_stderr": 0.02015597730704985,
"acc_norm": 0.8936170212765957,
"acc_norm_stderr": 0.02015597730704985
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8896551724137931,
"acc_stderr": 0.026109923428966807,
"acc_norm": 0.8896551724137931,
"acc_norm_stderr": 0.026109923428966807
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.8862433862433863,
"acc_stderr": 0.016352876480494796,
"acc_norm": 0.8862433862433863,
"acc_norm_stderr": 0.016352876480494796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.7301587301587301,
"acc_stderr": 0.03970158273235171,
"acc_norm": 0.7301587301587301,
"acc_norm_stderr": 0.03970158273235171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9612903225806452,
"acc_stderr": 0.010973819726797958,
"acc_norm": 0.9612903225806452,
"acc_norm_stderr": 0.010973819726797958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.8078817733990148,
"acc_stderr": 0.02771931570961478,
"acc_norm": 0.8078817733990148,
"acc_norm_stderr": 0.02771931570961478
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466115,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466115
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9212121212121213,
"acc_stderr": 0.021037183825716357,
"acc_norm": 0.9212121212121213,
"acc_norm_stderr": 0.021037183825716357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9646464646464646,
"acc_stderr": 0.01315731887804608,
"acc_norm": 0.9646464646464646,
"acc_norm_stderr": 0.01315731887804608
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084346,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.9128205128205128,
"acc_stderr": 0.014302931207177386,
"acc_norm": 0.9128205128205128,
"acc_norm_stderr": 0.014302931207177386
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.7888888888888889,
"acc_stderr": 0.024882116857655078,
"acc_norm": 0.7888888888888889,
"acc_norm_stderr": 0.024882116857655078
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9411764705882353,
"acc_stderr": 0.015283995352038426,
"acc_norm": 0.9411764705882353,
"acc_norm_stderr": 0.015283995352038426
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.7682119205298014,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.7682119205298014,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9743119266055046,
"acc_stderr": 0.006782898624451454,
"acc_norm": 0.9743119266055046,
"acc_norm_stderr": 0.006782898624451454
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02541642838876747,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02541642838876747
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9705882352941176,
"acc_stderr": 0.011858507536737417,
"acc_norm": 0.9705882352941176,
"acc_norm_stderr": 0.011858507536737417
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9493670886075949,
"acc_stderr": 0.014271760025370188,
"acc_norm": 0.9493670886075949,
"acc_norm_stderr": 0.014271760025370188
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8923766816143498,
"acc_stderr": 0.020799400082880004,
"acc_norm": 0.8923766816143498,
"acc_norm_stderr": 0.020799400082880004
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9083969465648855,
"acc_stderr": 0.025300035578642962,
"acc_norm": 0.9083969465648855,
"acc_norm_stderr": 0.025300035578642962
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.02513538235660422,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.02513538235660422
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9629629629629629,
"acc_stderr": 0.018257067489429676,
"acc_norm": 0.9629629629629629,
"acc_norm_stderr": 0.018257067489429676
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9447852760736196,
"acc_stderr": 0.017944712448654636,
"acc_norm": 0.9447852760736196,
"acc_norm_stderr": 0.017944712448654636
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.8392857142857143,
"acc_stderr": 0.034859460964757415,
"acc_norm": 0.8392857142857143,
"acc_norm_stderr": 0.034859460964757415
},
"harness|hendrycksTest-management|5": {
"acc": 0.9611650485436893,
"acc_stderr": 0.019129793517354922,
"acc_norm": 0.9611650485436893,
"acc_norm_stderr": 0.019129793517354922
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9829059829059829,
"acc_stderr": 0.008491806622565604,
"acc_norm": 0.9829059829059829,
"acc_norm_stderr": 0.008491806622565604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.92,
"acc_stderr": 0.027265992434429086,
"acc_norm": 0.92,
"acc_norm_stderr": 0.027265992434429086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9578544061302682,
"acc_stderr": 0.007184928704935858,
"acc_norm": 0.9578544061302682,
"acc_norm_stderr": 0.007184928704935858
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8872832369942196,
"acc_stderr": 0.017026126074681635,
"acc_norm": 0.8872832369942196,
"acc_norm_stderr": 0.017026126074681635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.9050279329608939,
"acc_stderr": 0.009805284011337068,
"acc_norm": 0.9050279329608939,
"acc_norm_stderr": 0.009805284011337068
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.014476405218161428,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.014476405218161428
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8971061093247589,
"acc_stderr": 0.017255830051445344,
"acc_norm": 0.8971061093247589,
"acc_norm_stderr": 0.017255830051445344
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.9197530864197531,
"acc_stderr": 0.015116405542849367,
"acc_norm": 0.9197530864197531,
"acc_norm_stderr": 0.015116405542849367
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.024414612974307713,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.024414612974307713
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.8305084745762712,
"acc_stderr": 0.009582414456640188,
"acc_norm": 0.8305084745762712,
"acc_norm_stderr": 0.009582414456640188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9411764705882353,
"acc_stderr": 0.014293099746606797,
"acc_norm": 0.9411764705882353,
"acc_norm_stderr": 0.014293099746606797
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.012030208014297142,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.012030208014297142
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8454545454545455,
"acc_stderr": 0.03462262571262667,
"acc_norm": 0.8454545454545455,
"acc_norm_stderr": 0.03462262571262667
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.9061224489795918,
"acc_stderr": 0.018671508543506656,
"acc_norm": 0.9061224489795918,
"acc_norm_stderr": 0.018671508543506656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9751243781094527,
"acc_stderr": 0.011012907274218229,
"acc_norm": 0.9751243781094527,
"acc_norm_stderr": 0.011012907274218229
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.01969463855669321,
"acc_norm": 0.96,
"acc_norm_stderr": 0.01969463855669321
},
"harness|hendrycksTest-virology|5": {
"acc": 0.7168674698795181,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.7168674698795181,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9298245614035088,
"acc_stderr": 0.019591541754525123,
"acc_norm": 0.9298245614035088,
"acc_norm_stderr": 0.019591541754525123
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.5375985523274007,
"mc2_stderr": 0.015202763451961539
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938263
},
"harness|gsm8k|5": {
"acc": 0.535253980288097,
"acc_stderr": 0.013738207990177317
}
}
```
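If you would rather work with the raw results file linked above than with the parquet-backed `results` configuration, it can be downloaded directly from the repository. A minimal sketch (the nesting of the full JSON file may differ slightly from the excerpt shown here, so inspect its top-level keys first):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file for this run (the filename matches the run timestamp above).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AA051611__V0201",
    filename="results_2024-02-02T03-15-18.446534.json",
    repo_type="dataset",
)

with open(path) as f:
    raw_results = json.load(f)

print(list(raw_results))  # top-level keys of the raw results file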
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dongyoung4091/hh-rlhf_with_features | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: helpfulness_chosen
dtype: int64
- name: helpfulness_rejected
dtype: int64
- name: specificity_chosen
dtype: int64
- name: specificity_rejected
dtype: int64
- name: intent_chosen
dtype: int64
- name: intent_rejected
dtype: int64
- name: factuality_chosen
dtype: int64
- name: factuality_rejected
dtype: int64
- name: easy-to-understand_chosen
dtype: int64
- name: easy-to-understand_rejected
dtype: int64
- name: relevance_chosen
dtype: int64
- name: relevance_rejected
dtype: int64
- name: readability_chosen
dtype: int64
- name: readability_rejected
dtype: int64
- name: enough-detail_chosen
dtype: int64
- name: enough-detail_rejected
dtype: int64
- name: biased:_chosen
dtype: int64
- name: biased:_rejected
dtype: int64
- name: fail-to-consider-individual-preferences_chosen
dtype: int64
- name: fail-to-consider-individual-preferences_rejected
dtype: int64
- name: repetetive_chosen
dtype: int64
- name: repetetive_rejected
dtype: int64
- name: fail-to-consider-context_chosen
dtype: int64
- name: fail-to-consider-context_rejected
dtype: int64
- name: too-long_chosen
dtype: int64
- name: too-long_rejected
dtype: int64
- name: human
dtype: string
- name: assistant_chosen
dtype: string
- name: assistant_rejected
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 14281240
num_examples: 9574
- name: test
num_bytes: 14225165
num_examples: 9574
download_size: 15456243
dataset_size: 28506405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AlekseyKorshuk/product-photography-v1-tiny-prompts-tasks-collage-filtered-annotated | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: category
dtype: string
- name: product
dtype: string
- name: prompt
dtype: string
- name: image_mask
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6050006679.820039
num_examples: 3606
download_size: 6052149820
dataset_size: 6050006679.820039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chenqile09/llama2-chinese-couplet | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: data
dtype: string
splits:
- name: train
num_bytes: 211969430
num_examples: 770491
- name: validation
num_bytes: 1101256
num_examples: 4000
download_size: 56353998
dataset_size: 213070686
---
# Dataset Card for "chenqile09-chinese-couplet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_205 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1140541804
num_examples: 223987
download_size: 1164556035
dataset_size: 1140541804
---
# Dataset Card for "chunk_205"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Safurai/Code-Instruct-700k | ---
dataset_info:
features:
- name: prompt
dtype: large_string
- name: main_topic
dtype: large_string
- name: subtopic
dtype: large_string
- name: adjective
dtype: large_string
- name: action_verb
dtype: large_string
- name: scenario
dtype: large_string
- name: target_audience
dtype: large_string
- name: programming_language
dtype: large_string
- name: common_sense_topic
dtype: large_string
- name: idx
dtype: int64
- name: response
dtype: large_string
splits:
- name: train
num_bytes: 1657193365
num_examples: 700000
download_size: 705514514
dataset_size: 1657193365
---
# Dataset Card for "Code-Instruct-700k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dariolopez/oasst1-es | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int32
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: int32
- name: synthetic
dtype: bool
- name: model_name
dtype: string
- name: detoxify
struct:
- name: toxicity
dtype: float64
- name: severe_toxicity
dtype: float64
- name: obscene
dtype: float64
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: threat
dtype: float64
- name: sexual_explicit
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
sequence:
- name: name
dtype: string
- name: count
dtype: int32
- name: labels
sequence:
- name: name
dtype: string
- name: value
dtype: float64
- name: count
dtype: int32
splits:
- name: train
num_bytes: 27090656.151000153
num_examples: 22763
- name: validation
num_bytes: 1445812.4526244036
num_examples: 1212
download_size: 615518
dataset_size: 28536468.603624556
language:
- es
size_categories:
- 10K<n<100K
---
# OpenAssistant Conversations Spanish Dataset (OASST1-es)
Subset of the original [OpenAssistant Conversations Dataset (OASST)](https://huggingface.co/datasets/OpenAssistant/oasst1) filtered by `lang=es`.
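In principle the subset can be rebuilt by filtering the upstream dataset on its `lang` column. A minimal sketch, assuming the upstream splits and column names are unchanged:
```python
from datasets import load_dataset

# Keep only the Spanish-language rows of the original OASST1 dataset.
oasst1 = load_dataset("OpenAssistant/oasst1")
oasst1_es = oasst1.filter(lambda row: row["lang"] == "es")

print({split: ds.num_rows for split, ds in oasst1_es.items()})
```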
|
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e4 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e4](https://huggingface.co/BFauber/lora_llama2-13b_10e4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T02:05:04.594097](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e4/blob/main/results_2024-02-10T02-05-04.594097.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.553196767770073,\n\
\ \"acc_stderr\": 0.03363143219504891,\n \"acc_norm\": 0.559134719077929,\n\
\ \"acc_norm_stderr\": 0.03435308822937811,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253597,\n \"mc2\": 0.3852316665112248,\n\
\ \"mc2_stderr\": 0.013857314176435714\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946704\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6165106552479586,\n\
\ \"acc_stderr\": 0.004852420856631472,\n \"acc_norm\": 0.8245369448317068,\n\
\ \"acc_norm_stderr\": 0.0037958533012440038\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874141,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874141\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6838709677419355,\n \"acc_stderr\": 0.026450874489042774,\n \"\
acc_norm\": 0.6838709677419355,\n \"acc_norm_stderr\": 0.026450874489042774\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186068,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186068\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046735,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046735\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404032,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404032\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2905027932960894,\n\
\ \"acc_stderr\": 0.015183844307206143,\n \"acc_norm\": 0.2905027932960894,\n\
\ \"acc_norm_stderr\": 0.015183844307206143\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110307,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037106,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037106\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n\
\ \"acc_stderr\": 0.012607654553832707,\n \"acc_norm\": 0.42046936114732725,\n\
\ \"acc_norm_stderr\": 0.012607654553832707\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.02007942040808792,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.02007942040808792\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573023,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573023\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253597,\n \"mc2\": 0.3852316665112248,\n\
\ \"mc2_stderr\": 0.013857314176435714\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483667\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2312357846853677,\n \
\ \"acc_stderr\": 0.01161358750316659\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-05-04.594097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-05-04.594097.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- '**/details_harness|winogrande|5_2024-02-10T02-05-04.594097.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T02-05-04.594097.parquet'
- config_name: results
data_files:
- split: 2024_02_10T02_05_04.594097
path:
- results_2024-02-10T02-05-04.594097.parquet
- split: latest
path:
- results_2024-02-10T02-05-04.594097.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e4](https://huggingface.co/BFauber/lora_llama2-13b_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e4",
"harness_winogrande_5",
    split="latest")
```
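The aggregated metrics mentioned above are stored in the `results` configuration defined in this card's YAML header. A minimal sketch of loading them (assuming the `latest` split listed there):
```python
from datasets import load_dataset

# Aggregated results of the evaluation run (the "results" config defined above)
results = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e4",
                       "results",
                       split="latest")
print(results[0])
```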
## Latest results
These are the [latest results from run 2024-02-10T02:05:04.594097](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e4/blob/main/results_2024-02-10T02-05-04.594097.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.553196767770073,
"acc_stderr": 0.03363143219504891,
"acc_norm": 0.559134719077929,
"acc_norm_stderr": 0.03435308822937811,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.3852316665112248,
"mc2_stderr": 0.013857314176435714
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946704
},
"harness|hellaswag|10": {
"acc": 0.6165106552479586,
"acc_stderr": 0.004852420856631472,
"acc_norm": 0.8245369448317068,
"acc_norm_stderr": 0.0037958533012440038
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874141,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874141
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365242,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365242
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186068,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186068
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046735,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046735
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404032,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404032
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2905027932960894,
"acc_stderr": 0.015183844307206143,
"acc_norm": 0.2905027932960894,
"acc_norm_stderr": 0.015183844307206143
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110307,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037106,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037106
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970473,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832707,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832707
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.02007942040808792,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.02007942040808792
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573023,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573023
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.3852316665112248,
"mc2_stderr": 0.013857314176435714
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483667
},
"harness|gsm8k|5": {
"acc": 0.2312357846853677,
"acc_stderr": 0.01161358750316659
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
neelblabla/enron_labeled_email-llama2-7b_finetuning | ---
task_categories:
- text-classification
language:
- en
pretty_name: enron_labeled_prompts
size_categories:
- 1K<n<10K
--- |
irds/beir_fiqa | ---
pretty_name: '`beir/fiqa`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/fiqa`
The `beir/fiqa` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/fiqa).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=57,638
- `queries` (i.e., topics); count=6,648
This dataset is used by: [`beir_fiqa_dev`](https://huggingface.co/datasets/irds/beir_fiqa_dev), [`beir_fiqa_test`](https://huggingface.co/datasets/irds/beir_fiqa_test), [`beir_fiqa_train`](https://huggingface.co/datasets/irds/beir_fiqa_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_fiqa', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
queries = load_dataset('irds/beir_fiqa', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Maia2018Fiqa,
title={WWW'18 Open Challenge: Financial Opinion Mining and Question Answering},
author={Macedo Maia and S. Handschuh and A. Freitas and Brian Davis and R. McDermott and M. Zarrouk and A. Balahur},
journal={Companion Proceedings of the The Web Conference 2018},
year={2018}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
AdapterOcean/med_alpaca_standardized_cluster_88_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8123207
num_examples: 5946
download_size: 4196884
dataset_size: 8123207
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_88_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atipico1/nq-test-format | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
- name: cos_sim
dtype: float64
- name: answer_match
dtype: bool
- name: is_valid_adversary
dtype: bool
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 70514130
num_examples: 3610
download_size: 41134068
dataset_size: 70514130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
somosnlp/wikihow_es_80train_20test | ---
dataset_info:
features:
- name: title
dtype: string
- name: section_name
dtype: string
- name: summary
dtype: string
- name: document
dtype: string
- name: english_section_name
dtype: string
- name: english_url
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 258645759
num_examples: 90528
- name: test
num_bytes: 64819387
num_examples: 22632
download_size: 185679223
dataset_size: 323465146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
toilaluan/reward_tuned_prompt_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: model_type
dtype: string
- name: request_id
dtype: int64
- name: topic
dtype: string
- name: reward
dtype: float64
- name: individual_rewards
struct:
- name: clip_aesthetic_rewarder
dtype: float64
- name: pick_rewarder
dtype: float64
- name: image_rewarder
dtype: float64
- name: hps_v2_rewarder
dtype: float64
splits:
- name: train
num_bytes: 463200
num_examples: 4500
download_size: 160515
dataset_size: 463200
---
# Dataset Card for "reward_tuned_prompt_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/local_market_vendor_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2255830
num_examples: 10000
download_size: 184916
dataset_size: 2255830
---
# Dataset Card for "local_market_vendor_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CognitiveLab/hh-rlhf-formatted | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 327229899
num_examples: 160800
download_size: 176226483
dataset_size: 327229899
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-ac4402f5-7985074 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- beans
eval_info:
task: image_multi_class_classification
model: mrm8488/convnext-tiny-finetuned-beans
metrics: []
dataset_name: beans
dataset_config: default
dataset_split: test
col_mapping:
image: image
target: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: mrm8488/convnext-tiny-finetuned-beans
* Dataset: beans
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
ddrg/super_eurlex | ---
annotations_creators:
- found
language:
- bg
- cs
- da
- de
- el
- en
- es
- et
- fi
- fr
- ga
- hr
- hu
- it
- lt
- lv
- mt
- nl
- pl
- pt
- ro
- sk
- sl
- sv
language_creators:
- found
license:
- mit
multilinguality:
- multilingual
size_categories:
- 1M<n<10M
source_datasets:
- original
tags:
- legal documents
- corpus
- eurlex
- html
task_categories:
- text-classification
- fill-mask
task_ids:
- multi-class-classification
- multi-label-classification
pretty_name: 'SuperEURLEX: A Corpus of Plain Text and HTML from EURLEX, Annotated for multiple Legal Domain Text Classification Tasks.'
---
# Dataset Card for SuperEURLEX
This dataset contains over 4.6M Legal Documents from EURLEX with Annotations.
Over 3.7M of these 4.6M documents are also available in HTML format.
This dataset can be used for pretraining language models as well as for testing them on legal text classification tasks.
Use this dataset as follows:
```python
from datasets import load_dataset
config = "0.DE" # {sector}.{lang}[.html]
dataset = load_dataset("ddrg/super_eurlex", config, split='train')
```
## Dataset Details
### Dataset Description
This Dataset was scraped from [EURLEX](https://eur-lex.europa.eu/homepage.html).
It contains more than 4.6M Legal Documents in Plain Text and over 3.7M in HTML Format.
Those Documents are separated by their language (This Dataset includes a total of 24 official European Languages)
and by their Sector.
#### The Table below shows the number of documents per language:
| | Raw | HTML |
|---:|--------:|--------:|
| BG | 29,778 | 27,718 |
| CS | 94,439 | 91,754 |
| DA | 398,559 | 300,488 |
| DE | 384,179 | 265,724 |
| EL | 167,502 | 117,009 |
| EN | 456,212 | 354,186 |
| ES | 253,821 | 201,400 |
| ET | 142,183 | 139,690 |
| FI | 238,143 | 214,206 |
| FR | 427,011 | 305,592 |
| GA | 19,673 | 19,437 |
| HR | 37,200 | 35,944 |
| HU | 69,275 | 66,334 |
| IT | 358,637 | 259,936 |
| LT | 62,975 | 61,139 |
| LV | 105,433 | 102,105 |
| MT | 46,695 | 43,969 |
| NL | 345,276 | 237,366 |
| PL | 146,502 | 143,490 |
| PT | 369,571 | 314,148 |
| RO | 47,398 | 45,317 |
| SK | 100,718 | 98,192 |
| SL | 170,583 | 166,646 |
| SV | 172,926 | 148,656 |
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
- **Repository:** https://huggingface.co/datasets/ddrg/super_eurlex/tree/main
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
### As Corpus for:
- **Pretraining of Language Models with self-supervised tasks** like Masked Language Modeling and Next Sentence Prediction
- Legal Text Analysis
### As Dataset for evaluation on the following task:
- *eurovoc*-Concepts Prediction i.e. which tags apply? (Multi-Label Classification (large scale))
  - An example for this task is given below
- *subject-matter* Prediction i.e. which other tags apply (Multi-Label Classification)
- *form* Classification i.e. What Kind of Document is it? (Multi-Class)
- And more
### Example for the Use of EUROVOC Concepts
```python
from datasets import load_dataset
import transformers as tr
from sklearn.preprocessing import MultiLabelBinarizer
import numpy as np
import evaluate
import uuid
# ==================== #
# Prepare Data #
# ==================== #
CONFIG = "3.EN" # {sector}.{lang}[.html]
MODEL_NAME = "distilroberta-base"
dataset = load_dataset("ddrg/super_eurlex", CONFIG, split='train')
tokenizer = tr.AutoTokenizer.from_pretrained(MODEL_NAME)
# Remove unlabeled examples (rows without EUROVOC concepts)
def remove_nulls(batch):
return [(sample != None) for sample in batch["eurovoc"]]
dataset = dataset.filter(remove_nulls, batched=True, keep_in_memory=True)
# Tokenize Text
def tokenize(batch):
return tokenizer(batch["text_cleaned"], truncation=True, padding="max_length")
# keep_in_memory is optional (the dataset is large though and can easily use up a lot of memory)
dataset = dataset.map(tokenize, batched=True, keep_in_memory=True)
# Create Label Column by encoding Eurovoc Concepts
encoder = MultiLabelBinarizer()
# List of all Possible Labels
eurovoc_concepts = dataset["eurovoc"]
encoder.fit(eurovoc_concepts)
def encode_labels(batch):
batch["label"] = encoder.transform(batch["eurovoc"])
return batch
dataset = dataset.map(encode_labels, batched=True, keep_in_memory=True)
# Split into train and Test set
dataset = dataset.train_test_split(0.2)
# ==================== #
# Load & Train Model #
# ==================== #
model = tr.AutoModelForSequenceClassification.from_pretrained(
MODEL_NAME,
num_labels=len(encoder.classes_),
problem_type="multi_label_classification",
)
metric = evaluate.load("JP-SystemsX/nDCG", experiment_id=uuid.uuid4())
def compute_metric(eval_pred):
predictions, labels = eval_pred
return metric.compute(predictions=predictions, references=labels, k=5)
# Set Hyperparameter
# Note: We stay mostly with default values to keep example short
# Though more hyperparameters should be set and tuned in practice
train_args = tr.TrainingArguments(
output_dir="./cache",
per_device_train_batch_size=16,
num_train_epochs=20
)
trainer = tr.Trainer(
model=model,
args=train_args,
train_dataset=dataset["train"],
compute_metrics=compute_metric,
)
trainer.train() # This will take a while
print(trainer.evaluate(dataset["test"]))
# >>> {'eval_loss': 0.0018887673504650593, 'eval_nDCG@5': 0.8072531683578489, 'eval_runtime': 663.8582, 'eval_samples_per_second': 32.373, 'eval_steps_per_second': 4.048, 'epoch': 20.0}
```
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
This dataset is divided into multiple splits by _Sector x Language x Format_
Sector refers to the kind of Document it belongs to:
- **0:** Consolidated acts
- **1:** Treaties
- **2:** International agreements
- **3:** Legislation
- **4:** Complementary legislation
- **5:** Preparatory acts and working documents
- **6:** Case-law
- **7:** National transposition measures
- **8:** References to national case-law concerning EU law
- **9:** Parliamentary questions
- **C:** Other documents published in the Official Journal C series
- **E:** EFTA documents
Language refers to each of the 24 official European Languages that were included at the time of dataset creation:
- BG ~ Bulgarian
- CS ~ Czech
- DA ~ Danish
- DE ~ German
- EL ~ Greek
- EN ~ English
- ES ~ Spanish
- ET ~ Estonian
- FI ~ Finnish
- FR ~ French
- GA ~ Irish
- HR ~ Croatian
- HU ~ Hungarian
- IT ~ Italian
- LT ~ Lithuanian
- LV ~ Latvian
- MT ~ Maltese
- NL ~ Dutch
- PL ~ Polish
- PT ~ Portuguese
- RO ~ Romanian
- SK ~ Slovak
- SL ~ Slovenian
- SV ~ Swedish
Format refers to plain Text (default) or HTML format (.html)
> Note: Plain Text generally contains more documents because not all documents were available in HTML format, but those that were are included in both formats
Those Splits are named the following way:
`{sector}.{lang}[.html]`
For Example:
- `3.EN` would be English legislative documents in plain text format
- `3.EN.html` would be the same in HTML Format
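To make the naming concrete, here is a minimal sketch loading the same sector/language pair in both formats (using the `3.EN` and `3.EN.html` configs named above; any other sector/language combination works the same way):
```python
from datasets import load_dataset

# Plain-text English legislation (sector 3)
plain = load_dataset("ddrg/super_eurlex", "3.EN", split="train")

# The same sector and language in HTML format (only documents available as HTML)
html = load_dataset("ddrg/super_eurlex", "3.EN.html", split="train")
```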
Each _Sector_ has its own set of metadata:
<details><summary>Sector 0 (Consolidated acts)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of Document e.g. Consolidated text, or Treaty
</p>
</details>
<details><summary>Sector 1 (Treaties)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of Document e.g. Consolidated text, or Treaty
- _subject_matter_ ~ Keywords that provide a general overview of content in a document; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _current_consolidated_version_ ~ date when this version of the document was consolidated `Format DD/MM/YYYY`
- _directory_code_ ~ Information to structure documents in some kind of directory structure by topic e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _eurovoc_ ~ Keywords that describe document content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
</p>
</details>
<details><summary>Sector 2 (International agreements)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of Document e.g. Consolidated text, or Treaty
- _directory_code_ ~ Information to structure documents in some kind of directory structure by topic e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of content in a document; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe document content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
- _latest_consolidated_version_ ~ `Format DD/MM/YYYY`
- _current_consolidated_version_ ~ `Format DD/MM/YYYY`
</p>
</details>
<details><summary>Sector 3 (Legislation)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _directory_code_ ~ Information used to organize documents into a topic-based directory structure, e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
- _latest_consolidated_version_ ~ `Format DD/MM/YYYY`
- _current_consolidated_version_ ~ `Format DD/MM/YYYY`
</p>
</details>
<details><summary>Sector 4 (Complementary legislation)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _directory_code_ ~ Information used to organize documents into a topic-based directory structure, e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
- _latest_consolidated_version_ ~ `Format DD/MM/YYYY`
- _current_consolidated_version_ ~ `Format DD/MM/YYYY`
</p>
</details>
<details><summary>Sector 5 (Preparatory acts and working documents)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _directory_code_ ~ Information used to organize documents into a topic-based directory structure, e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
- _latest_consolidated_version_ ~ `Format DD/MM/YYYY`
</p>
</details>
<details><summary>Sector 6 (Case-law)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _directory_code_ ~ Information used to organize documents into a topic-based directory structure, e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
- _case-law_directory_code_before_lisbon_ ~ Classification system used for case law before the Treaty of Lisbon came into effect (2009); each code reflects a particular area of EU law
</p>
</details>
<details><summary>Sector 7 (National transposition measures)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _transposed_legal_acts_ ~ National laws that exist in EU member states as a direct result of the need to comply with EU directives
</p>
</details>
<details><summary>Sector 8 (References to national case-law concerning EU law)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _case-law_directory_code_before_lisbon_ ~ Classification system used for case law before the Treaty of Lisbon came into effect (2009); each code reflects a particular area of EU law
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
</p>
</details>
<details><summary>Sector 9 (Parliamentary questions)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _directory_code_ ~ Information used to organize documents into a topic-based directory structure, e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
</p>
</details>
<details><summary>Sector C (Other documents published in the Official Journal C series)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
</p>
</details>
<details><summary>Sector E (EFTA documents)</summary><p>
- _celex_id_ ~ Unique Identifier for each document
- _text_cleaned_ (Plain Text) **or** _text_html_raw_ (HTML Format)
- _form_ ~ Kind of document, e.g. consolidated text or treaty
- _directory_code_ ~ Information used to organize documents into a topic-based directory structure, e.g. `'03.50.30.00 Agriculture / Approximation of laws and health measures / Animal health and zootechnics'`
- _subject_matter_ ~ Keywords that provide a general overview of a document's content; see [here](https://eur-lex.europa.eu/content/e-learning/browsing_options.html) for more information
- _eurovoc_ ~ Keywords that describe a document's content based on the European Vocabulary; see [here](https://eur-lex.europa.eu/browse/eurovoc.html) for more information
</p>
</details>
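Because the available metadata differs between sectors, it can be worth inspecting the column names of a loaded split before relying on a particular field. A short sketch under the same assumptions as above (placeholder repository id, sector names used as configuration names):
```python
from datasets import load_dataset

REPO_ID = "<namespace>/<this-eurlex-dataset>"  # placeholder, not the real repo id

# Compare the metadata available for case-law (sector 6) with that of
# national transposition measures (sector 7), both in English.
for cfg in ("6.EN", "7.EN"):
    ds = load_dataset(REPO_ID, cfg, split="train")
    print(cfg, "->", ds.column_names)
```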
## Dataset Creation
### Curation Rationale
This dataset was created for the pretraining and/or evaluation of legal language models.
### Source Data
#### Data Collection and Processing
We used the [EURLEX-Web-Scrapper Repo](https://github.com/JP-SystemsX/Eurlex-Web-Scrapper) for the data collection process.
#### Who are the source data producers?
The source data stems from the [EURLEX-Website](https://eur-lex.europa.eu/) and was therefore produced by various entities within the European Union.
#### Personal and Sensitive Information
No Personal or Sensitive Information is included to the best of our knowledge.
## Bias, Risks, and Limitations
- We removed HTML documents from which we couldn't extract plain text, under the assumption that those are **corrupted files**.
However, we can't guarantee that we removed all of them.
- The extraction of plain text from legal HTML documents can lead to **formatting issues**,
e.g. extracting text from tables might mix up the reading order such that the result becomes nearly incomprehensible.
- This dataset might contain many **missing values** in the metadata columns, as not every document was annotated in the same way.
[More Information Needed]
### Recommendations
- Consider removing rows with missing values relevant to your task before training a model on the data, as sketched below
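As a hedged illustration of this recommendation, one way to drop such rows with `datasets.Dataset.filter`; the repository id is a placeholder, and the choice of `text_cleaned` and `eurovoc` as required columns is only an example:
```python
from datasets import load_dataset

REPO_ID = "<namespace>/<this-eurlex-dataset>"  # placeholder, not the real repo id

ds = load_dataset(REPO_ID, "3.EN", split="train")

# Example choice of columns a downstream task might depend on -- adjust as needed.
REQUIRED_COLUMNS = ("text_cleaned", "eurovoc")

def has_required_fields(example: dict) -> bool:
    """Keep only rows whose required fields are present and non-empty."""
    return all(example.get(col) not in (None, "", []) for col in REQUIRED_COLUMNS)

filtered = ds.filter(has_required_fields)
print(f"Kept {len(filtered)} of {len(ds)} rows")
```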
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LawInformedAI/claudette_tos | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1834600
num_examples: 9414
download_size: 0
dataset_size: 1834600
---
# Dataset Card for "claudette_tos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malucoelhaofc/KitConnorV2 | ---
license: openrail
---
|
Locutusque/TM-DATA | ---
license: apache-2.0
task_categories:
- text-generation
- text-classification
- text2text-generation
- fill-mask
language:
- en
size_categories:
- 1M<n<10M
tags:
- not-for-all-audiences
---
Dataset used to train TinyMistral-248m-v2. It consists of around 8 million examples drawn from the following sources:
- 4 million Wikipedia pages
- 1 million arXiv papers
- 1.5 million web pages sourced from RefinedWeb and SlimPajama
- 200,000 college textbooks
- 1 million Stack Exchange forum posts
This dataset may contain NSFW examples; use at your own risk.
open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3 | ---
pretty_name: Evaluation run of flemmingmiguel/MBX-7B-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-29T00:10:52.670556](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3/blob/main/results_2024-01-29T00-10-52.670556.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6568226997496719,\n\
\ \"acc_stderr\": 0.03194203481467334,\n \"acc_norm\": 0.6561689082057574,\n\
\ \"acc_norm_stderr\": 0.03261255172047086,\n \"mc1\": 0.5789473684210527,\n\
\ \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7186768399576933,\n\
\ \"mc2_stderr\": 0.014757394057634371\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.01328452529240351,\n\
\ \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288694\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7161919936267676,\n\
\ \"acc_stderr\": 0.004499233874427508,\n \"acc_norm\": 0.8890659231228839,\n\
\ \"acc_norm_stderr\": 0.003134086549952684\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"\
acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n\
\ \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7186768399576933,\n\
\ \"mc2_stderr\": 0.014757394057634371\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624184\n }\n}\n```"
repo_url: https://huggingface.co/flemmingmiguel/MBX-7B-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|arc:challenge|25_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|arc:challenge|25_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|arc:challenge|25_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|arc:challenge|25_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|gsm8k|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|gsm8k|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|gsm8k|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|gsm8k|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hellaswag|10_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hellaswag|10_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hellaswag|10_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hellaswag|10_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T00-11-56.066743.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T06-10-31.066280.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-32-50.245516.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-29T00-10-52.670556.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- '**/details_harness|winogrande|5_2024-01-24T00-11-56.066743.parquet'
- split: 2024_01_25T06_10_31.066280
path:
- '**/details_harness|winogrande|5_2024-01-25T06-10-31.066280.parquet'
- split: 2024_01_26T02_32_50.245516
path:
- '**/details_harness|winogrande|5_2024-01-26T02-32-50.245516.parquet'
- split: 2024_01_29T00_10_52.670556
path:
- '**/details_harness|winogrande|5_2024-01-29T00-10-52.670556.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-29T00-10-52.670556.parquet'
- config_name: results
data_files:
- split: 2024_01_24T00_11_56.066743
path:
- results_2024-01-24T00-11-56.066743.parquet
- split: 2024_01_25T06_10_31.066280
path:
- results_2024-01-25T06-10-31.066280.parquet
- split: 2024_01_26T02_32_50.245516
path:
- results_2024-01-26T02-32-50.245516.parquet
- split: 2024_01_29T00_10_52.670556
path:
- results_2024-01-29T00-10-52.670556.parquet
- split: latest
path:
- results_2024-01-29T00-10-52.670556.parquet
---
# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3",
"harness_winogrande_5",
split="train")
```
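If you only need the aggregated scores rather than the per-sample details, a minimal sketch is shown below (it assumes the same repository id together with the "results" configuration and the "latest" split declared in the configuration block above):
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run: "latest" split of the "results" config
results = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3",
	"results",
	split="latest")
print(results[0])
```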
## Latest results
These are the [latest results from run 2024-01-29T00:10:52.670556](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3/blob/main/results_2024-01-29T00-10-52.670556.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6568226997496719,
"acc_stderr": 0.03194203481467334,
"acc_norm": 0.6561689082057574,
"acc_norm_stderr": 0.03261255172047086,
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7186768399576933,
"mc2_stderr": 0.014757394057634371
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.01328452529240351,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288694
},
"harness|hellaswag|10": {
"acc": 0.7161919936267676,
"acc_stderr": 0.004499233874427508,
"acc_norm": 0.8890659231228839,
"acc_norm_stderr": 0.003134086549952684
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7186768399576933,
"mc2_stderr": 0.014757394057634371
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079229
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dylanmontoya22/bioBERT-ner-biomedical-text | ---
dataset_info:
features:
- name: text
dtype: string
- name: tokens
sequence: string
- name: annotation
list:
- name: end
dtype: int64
- name: label
dtype: string
- name: start
dtype: int64
splits:
- name: train
num_bytes: 148595
num_examples: 710
download_size: 24684
dataset_size: 148595
---
# Dataset Card for "bioBERT-ner-biomedical-text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JinglesDados/Peri | ---
license: openrail
---
|
BangumiBase/seraphoftheend | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Seraph Of The End
This is the image base of the bangumi Seraph of the End. We detected 51 characters and 3456 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 238 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 32 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 191 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 106 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 152 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 41 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 35 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 41 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 75 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 14 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 13 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 16 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 36 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 702 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 24 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 173 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 61 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 12 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 10 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 227 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 90 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 67 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 28 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 64 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 12 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 18 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 353 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 27 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 21 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 14 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 94 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 8 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 13 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 7 | [Download](34/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 35 | 15 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 7 | [Download](36/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 37 | 17 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 8 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 6 | [Download](39/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 40 | 45 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 31 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 13 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 28 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 36 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 16 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 6 | [Download](47/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 48 | 17 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 20 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 146 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
bdsaglam/musique-answerable-2hop-subset-jerx-reward-2023-12-31T14-20-25 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: reward
dtype: int64
splits:
- name: train
num_bytes: 1306097
num_examples: 900
download_size: 89215
dataset_size: 1306097
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaleemWaheed/twitter_dataset_1713146073 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25314
num_examples: 56
download_size: 12998
dataset_size: 25314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hqfx/tulu_v2 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: science.scierc_ner
num_bytes: 634623
num_examples: 349
- name: sharegpt
num_bytes: 826685526
num_examples: 74312
- name: science.scifact_json
num_bytes: 2350372
num_examples: 919
- name: lima
num_bytes: 2847761
num_examples: 1018
- name: gpt4_alpaca
num_bytes: 16166613
num_examples: 19906
- name: science.evidence_inference
num_bytes: 6623765
num_examples: 1674
- name: oasst1
num_bytes: 11497333
num_examples: 7331
- name: science.scitldr_aic
num_bytes: 13404075
num_examples: 1959
- name: science.scierc_relation
num_bytes: 735295
num_examples: 349
- name: science.qasper_truncated_4000
num_bytes: 35171822
num_examples: 2218
- name: hard_coded
num_bytes: 71030
num_examples: 140
- name: code_alpaca
num_bytes: 7112342
num_examples: 20016
- name: cot
num_bytes: 56130437
num_examples: 49747
- name: wizardlm
num_bytes: 69904382
num_examples: 29810
- name: open_orca
num_bytes: 53121184
num_examples: 29683
- name: flan_v2
num_bytes: 105696294
num_examples: 49123
download_size: 541137401
dataset_size: 1208152854
configs:
- config_name: default
data_files:
- split: science.scierc_ner
path: data/science.scierc_ner-*
- split: sharegpt
path: data/sharegpt-*
- split: science.scifact_json
path: data/science.scifact_json-*
- split: lima
path: data/lima-*
- split: gpt4_alpaca
path: data/gpt4_alpaca-*
- split: science.evidence_inference
path: data/science.evidence_inference-*
- split: oasst1
path: data/oasst1-*
- split: science.scitldr_aic
path: data/science.scitldr_aic-*
- split: science.scierc_relation
path: data/science.scierc_relation-*
- split: science.qasper_truncated_4000
path: data/science.qasper_truncated_4000-*
- split: hard_coded
path: data/hard_coded-*
- split: code_alpaca
path: data/code_alpaca-*
- split: cot
path: data/cot-*
- split: wizardlm
path: data/wizardlm-*
- split: open_orca
path: data/open_orca-*
- split: flan_v2
path: data/flan_v2-*
---
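This card is configuration-only, but the split layout above implies that each source (e.g. `flan_v2`, `sharegpt`, `cot`) is exposed as its own named split of conversation records. A minimal, untested loading sketch follows; the split name is chosen only as an example, and the turn structure is assumed from the `conversation` feature declared above.

```python
from datasets import load_dataset

# Each data source is exposed as its own split; 'flan_v2' is just one example.
flan = load_dataset("hqfx/tulu_v2", split="flan_v2")

# Every record is assumed to hold a 'conversation' list of {'role', 'content'} turns.
first = flan[0]
for turn in first["conversation"]:
    print(f"{turn['role']}: {turn['content'][:80]}")
```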
|