| datasetId | card |
|---|---|
larrygzlb1994/HolidayRePaint000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
splits:
- name: train
num_bytes: 1057140898.176
num_examples: 8266
download_size: 1060940822
dataset_size: 1057140898.176
---
# Dataset Card for "HolidayRePaint000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maknee/ggml-vicuna-v0-quantized | ---
license: apache-2.0
tags:
- vicuna
- vicuna-v0
- ggml
pipeline_tag: conversational
language:
- en
- bg
- ca
- cs
- da
- de
- es
- fr
- hr
- hu
- it
- nl
- pl
- pt
- ro
- ru
- sl
- sr
- sv
- uk
library_name: adapter-transformers
---
These are quantized ggml binary files for the Vicuna 7B and 13B models. These models use Vicuna v0.
They can be used in conjunction with the MiniGPT-4 ggml models [7B](https://huggingface.co/datasets/maknee/minigpt4-7b-ggml) and [13B](https://huggingface.co/datasets/maknee/minigpt4-13b-ggml) in [minigpt4.cpp](https://github.com/Maknee/minigpt4.cpp).
The Q5_K and Q6_K quantizations are recommended. If they cause any issues, use Q4_1 or Q4_0 instead.
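The quantization recommendation above can be encoded as a small fallback helper. This is a hedged sketch: the function name, preference list, and input handling are illustrative, not part of this repository.

```python
# Preference order from the recommendation above: Q6_K/Q5_K first,
# then the Q4 fallbacks. Purely illustrative; adjust to the files
# you actually have available.
QUANT_PREFERENCE = ["Q6_K", "Q5_K", "Q4_1", "Q4_0"]

def pick_quantization(available, prefer=QUANT_PREFERENCE):
    """Return the most preferred quantization present in `available`."""
    for quant in prefer:
        if quant in available:
            return quant
    raise ValueError(f"no supported quantization among {sorted(available)}")
```

For example, `pick_quantization(["Q4_0", "Q5_K"])` returns `"Q5_K"`, since Q5_K outranks the Q4 fallbacks.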
---
# Vicuna Model Card
## Model details
**Model type:**
Vicuna is an open-source chatbot trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.
It is an auto-regressive language model, based on the transformer architecture.
**Model date:**
Vicuna was trained between March 2023 and April 2023.
**Organizations developing the model:**
The Vicuna team with members from UC Berkeley, CMU, Stanford, and UC San Diego.
**Paper or resources for more information:**
https://vicuna.lmsys.org/
**License:**
Apache License 2.0
**Where to send questions or comments about the model:**
https://github.com/lm-sys/FastChat/issues
## Intended use
**Primary intended uses:**
The primary use of Vicuna is research on large language models and chatbots.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in natural language processing, machine learning, and artificial intelligence.
## Training dataset
70K conversations collected from ShareGPT.com.
(48K for the unfiltered variant: roughly 22K low-quality conversations were removed – see https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered)
## Evaluation dataset
A preliminary evaluation of the model quality is conducted by creating a set of 80 diverse questions and utilizing GPT-4 to judge the model outputs. See https://vicuna.lmsys.org/ for more details.
|
anan-2024/twitter_dataset_1713020593 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 106531
num_examples: 274
download_size: 61071
dataset_size: 106531
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jimregan/nst_swedish_tts | ---
license: cc0-1.0
task_categories:
- text-to-speech
language:
- sv
---
# Dataset Card for NST Swedish Speech Synthesis (44 kHz)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [NST Swedish Speech Synthesis (44 kHz)](https://www.nb.no/sprakbanken/en/resource-catalogue/oai-nb-no-sbr-18/)
### Dataset Summary
The corpus consists of recordings from a single speaker, with 5277 segments.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The audio is in Swedish.
## Dataset Structure
[Needs More Information]
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
(The following is a partially corrected machine translation from [here](https://www.nb.no/sbfil/dok/nst_taledat_se.pdf).)
The data was developed by Nordisk språkteknologi holding AS (NST),
which went bankrupt in 2003.
In 2006, a group jointly owned by the University of Oslo, the University of
Bergen, the Norwegian University of Science and Technology, the Language
Council, and IBM bought the assets of NST, to ensure that the linguistic
resources NST had developed were taken care of. The National Library was
commissioned by the Ministry of Culture to build a Norwegian language bank
in 2009, and started this work in 2010.
The resources from NST were transferred to the National Library in May 2011,
and they are now made available in the Language Bank, initially without
further processing.
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[The Norwegian Language Bank](https://www.nb.no/sprakbanken/en/sprakbanken/)
### Licensing Information
[CC0: Public Domain](https://creativecommons.org/publicdomain/zero/1.0/)
### Citation Information
[Needs More Information]
### Contributions
[Needs More Information] |
ylacombe/example_process_dataset | ---
dataset_info:
config_name: midlands_male
features:
- name: line_id
dtype: string
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: float64
- name: phonemes
dtype: string
splits:
- name: train
num_bytes: 107980
num_examples: 450
download_size: 63769
dataset_size: 107980
configs:
- config_name: midlands_male
data_files:
- split: train
path: midlands_male/train-*
---
|
jayasuryajsk/Yuin | ---
license: apache-2.0
---
|
CyberHarem/fumizuki_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fumizuki/文月/文月 (Kantai Collection)
This is the dataset of fumizuki/文月/文月 (Kantai Collection), containing 373 images and their tags.
The core tags of this character are `brown_hair, long_hair, ponytail, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 373 | 261.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fumizuki_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 373 | 186.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fumizuki_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 779 | 378.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fumizuki_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 373 | 246.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fumizuki_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 779 | 476.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fumizuki_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
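The IMG+TXT packages above store each image alongside a same-stem `.txt` file holding its tags. A minimal sketch of walking such an extracted directory follows; the helper name and the comma-separated tag layout are assumptions about the format, not a documented API.

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Yield (image_path, tags) pairs from an IMG+TXT directory.

    Assumes each image has a sibling .txt file with the same stem,
    containing comma-separated tags (an assumption about the layout).
    Images without a matching .txt file are skipped.
    """
    image_exts = {".png", ".jpg", ".jpeg", ".webp"}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```

For the raw package specifically, the waifuc loader shown below is the supported route; this sketch is only for the plain IMG+TXT archives.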
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fumizuki_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some distinct outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, blush, crescent_pin, open_mouth, simple_background, solo, twitter_username, white_necktie, hair_between_eyes, long_sleeves, white_background, black_sailor_collar, black_shirt, looking_at_viewer, neckerchief, one-hour_drawing_challenge, smile, black_socks, kneehighs, pleated_skirt |
| 1 | 6 |  |  |  |  |  | 1girl, black_sailor_collar, black_serafuku, black_shirt, black_skirt, crescent_pin, long_sleeves, pleated_skirt, simple_background, solo, white_necktie, cowboy_shot, looking_at_viewer, white_background, white_neckerchief, hair_between_eyes, open_mouth, smile, blush |
| 2 | 23 |  |  |  |  |  | 1girl, black_sailor_collar, long_sleeves, serafuku, solo, yellow_neckerchief, white_shirt, blue_jacket, crescent_pin, simple_background, smile, blush, open_mouth, white_background, looking_at_viewer, black_skirt, hair_between_eyes, yellow_necktie, upper_body |
| 3 | 6 |  |  |  |  |  | 1girl, solo, looking_at_viewer, navel, open_mouth, blush, white_bikini, armpits, crescent, flat_chest, one-hour_drawing_challenge, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_serafuku | black_skirt | blush | crescent_pin | open_mouth | simple_background | solo | twitter_username | white_necktie | hair_between_eyes | long_sleeves | white_background | black_sailor_collar | black_shirt | looking_at_viewer | neckerchief | one-hour_drawing_challenge | smile | black_socks | kneehighs | pleated_skirt | cowboy_shot | white_neckerchief | serafuku | yellow_neckerchief | white_shirt | blue_jacket | yellow_necktie | upper_body | navel | white_bikini | armpits | crescent | flat_chest |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:--------|:---------------|:-------------|:--------------------|:-------|:-------------------|:----------------|:--------------------|:---------------|:-------------------|:----------------------|:--------------|:--------------------|:--------------|:-----------------------------|:--------|:--------------|:------------|:----------------|:--------------|:--------------------|:-----------|:---------------------|:--------------|:--------------|:-----------------|:-------------|:--------|:---------------|:----------|:-----------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | X | | | X | X | X | | | | | | | | | | | |
| 2 | 23 |  |  |  |  |  | X | | X | X | X | X | X | X | | | X | X | X | X | | X | | | X | | | | | | X | X | X | X | X | X | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | X | | X | X | X | | | | | X | | | X | | X | | | | | | | | | | | | | X | X | X | X | X |
|
distilled-from-one-sec-cv12/chunk_66 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1169675176
num_examples: 227918
download_size: 1192107166
dataset_size: 1169675176
---
# Dataset Card for "chunk_66"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/pythia-memorized-evals | ---
dataset_info:
features:
- name: index
dtype: int64
- name: tokens
sequence: int64
- name: __index_level_0__
dtype: int64
splits:
- name: duped.1.4b
num_bytes: 730820104
num_examples: 1373722
- name: deduped.1.4b
num_bytes: 557587604
num_examples: 1048097
- name: duped.160m
num_bytes: 366906036
num_examples: 689673
- name: deduped.160m
num_bytes: 309195740
num_examples: 581195
- name: duped.12b
num_bytes: 1267397432
num_examples: 2382326
- name: deduped.12b
num_bytes: 995486380
num_examples: 1871215
- name: duped.70m
num_bytes: 246822996
num_examples: 463953
- name: deduped.70m
num_bytes: 218890336
num_examples: 411448
- name: duped.2.8b
num_bytes: 891140964
num_examples: 1675077
- name: deduped.2.8b
num_bytes: 720972252
num_examples: 1355211
- name: duped.410m
num_bytes: 516221412
num_examples: 970341
- name: deduped.410m
num_bytes: 431472748
num_examples: 811039
- name: duped.6.9b
num_bytes: 1128355508
num_examples: 2120969
- name: deduped.6.9b
num_bytes: 893916408
num_examples: 1680294
- name: duped.1b
num_bytes: 668267012
num_examples: 1256141
- name: deduped.1b
num_bytes: 549484180
num_examples: 1032865
- name: duped.12b.23000
num_bytes: 105429100
num_examples: 198175
- name: duped.12b.43000
num_bytes: 235278596
num_examples: 442253
- name: duped.12b.63000
num_bytes: 385528696
num_examples: 724678
- name: duped.12b.83000
num_bytes: 568442532
num_examples: 1068501
- name: duped.12b.103000
num_bytes: 803564188
num_examples: 1510459
- name: duped.12b.123000
num_bytes: 1061877852
num_examples: 1996011
- name: deduped.12b.23000
num_bytes: 86938376
num_examples: 163418
- name: deduped.12b.43000
num_bytes: 190915116
num_examples: 358863
- name: deduped.12b.63000
num_bytes: 311255644
num_examples: 585067
- name: deduped.12b.83000
num_bytes: 453300176
num_examples: 852068
- name: deduped.12b.103000
num_bytes: 636047496
num_examples: 1195578
- name: deduped.12b.123000
num_bytes: 832077260
num_examples: 1564055
- name: deduped.1b.new
num_bytes: 549484180
num_examples: 1032865
download_size: 4735823411
dataset_size: 16713076324
configs:
- config_name: default
data_files:
- split: duped.12b.23000
path: data/duped.12b.23000-*
- split: duped.12b.43000
path: data/duped.12b.43000-*
- split: duped.12b.63000
path: data/duped.12b.63000-*
- split: duped.12b.83000
path: data/duped.12b.83000-*
- split: duped.12b.103000
path: data/duped.12b.103000-*
- split: duped.12b.123000
path: data/duped.12b.123000-*
- split: deduped.12b.23000
path: data/deduped.12b.23000-*
- split: deduped.12b.43000
path: data/deduped.12b.43000-*
- split: deduped.12b.63000
path: data/deduped.12b.63000-*
- split: deduped.12b.83000
path: data/deduped.12b.83000-*
- split: deduped.12b.103000
path: data/deduped.12b.103000-*
- split: deduped.12b.123000
path: data/deduped.12b.123000-*
- split: duped.70m
path: data/duped.70m-*
- split: duped.160m
path: data/duped.160m-*
- split: duped.410m
path: data/duped.410m-*
- split: duped.1b
path: data/duped.1b-*
- split: duped.1.4b
path: data/duped.1.4b-*
- split: duped.2.8b
path: data/duped.2.8b-*
- split: duped.6.9b
path: data/duped.6.9b-*
- split: duped.12b
path: data/duped.12b-*
- split: deduped.70m
path: data/deduped.70m-*
- split: deduped.160m
path: data/deduped.160m-*
- split: deduped.410m
path: data/deduped.410m-*
- split: deduped.1b
path: data/deduped.1b-*
- split: deduped.1.4b
path: data/deduped.1.4b-*
- split: deduped.2.8b
path: data/deduped.2.8b-*
- split: deduped.6.9b
path: data/deduped.6.9b-*
- split: deduped.12b
path: data/deduped.12b-*
- split: deduped.1b.new
path: data/deduped.1b.new-*
---
# Dataset Card for "pythia-memorized-evals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Binaryy/reddit-multimodal-sample | ---
dataset_info:
features:
- name: image
dtype: image
- name: Title
dtype: string
- name: Image URL
dtype: string
- name: File Path
dtype: string
splits:
- name: train
num_bytes: 1954008168.804
num_examples: 1638
download_size: 1843573587
dataset_size: 1954008168.804
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reddit-multimodal-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_kimou605__shadow-clown-BioMistral-7B-DARE | ---
pretty_name: Evaluation run of kimou605/shadow-clown-BioMistral-7B-DARE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kimou605/shadow-clown-BioMistral-7B-DARE](https://huggingface.co/kimou605/shadow-clown-BioMistral-7B-DARE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kimou605__shadow-clown-BioMistral-7B-DARE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T03:43:27.988905](https://huggingface.co/datasets/open-llm-leaderboard/details_kimou605__shadow-clown-BioMistral-7B-DARE/blob/main/results_2024-03-15T03-43-27.988905.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6437006121406286,\n\
\ \"acc_stderr\": 0.03219656527666736,\n \"acc_norm\": 0.6450809586881737,\n\
\ \"acc_norm_stderr\": 0.0328491614180031,\n \"mc1\": 0.5104039167686658,\n\
\ \"mc1_stderr\": 0.017499711430249268,\n \"mc2\": 0.6767936843695579,\n\
\ \"mc2_stderr\": 0.014902102789887895\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382506,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693246\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6832304321848237,\n\
\ \"acc_stderr\": 0.0046426596138574,\n \"acc_norm\": 0.867755427205736,\n\
\ \"acc_norm_stderr\": 0.0033806414709899296\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724067,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895518,\n \"\
acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650156,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650156\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n\
\ \"acc_stderr\": 0.01646981492840617,\n \"acc_norm\": 0.4134078212290503,\n\
\ \"acc_norm_stderr\": 0.01646981492840617\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279046,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279046\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.012716941720734804,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.012716941720734804\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5104039167686658,\n\
\ \"mc1_stderr\": 0.017499711430249268,\n \"mc2\": 0.6767936843695579,\n\
\ \"mc2_stderr\": 0.014902102789887895\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305892\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5974222896133434,\n \
\ \"acc_stderr\": 0.01350852306366343\n }\n}\n```"
repo_url: https://huggingface.co/kimou605/shadow-clown-BioMistral-7B-DARE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|arc:challenge|25_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|gsm8k|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hellaswag|10_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T03-43-27.988905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T03-43-27.988905.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- '**/details_harness|winogrande|5_2024-03-15T03-43-27.988905.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T03-43-27.988905.parquet'
- config_name: results
data_files:
- split: 2024_03_15T03_43_27.988905
path:
- results_2024-03-15T03-43-27.988905.parquet
- split: latest
path:
- results_2024-03-15T03-43-27.988905.parquet
---
# Dataset Card for Evaluation run of kimou605/shadow-clown-BioMistral-7B-DARE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kimou605/shadow-clown-BioMistral-7B-DARE](https://huggingface.co/kimou605/shadow-clown-BioMistral-7B-DARE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kimou605__shadow-clown-BioMistral-7B-DARE",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-15T03:43:27.988905](https://huggingface.co/datasets/open-llm-leaderboard/details_kimou605__shadow-clown-BioMistral-7B-DARE/blob/main/results_2024-03-15T03-43-27.988905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6437006121406286,
"acc_stderr": 0.03219656527666736,
"acc_norm": 0.6450809586881737,
"acc_norm_stderr": 0.0328491614180031,
"mc1": 0.5104039167686658,
"mc1_stderr": 0.017499711430249268,
"mc2": 0.6767936843695579,
"mc2_stderr": 0.014902102789887895
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382506,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693246
},
"harness|hellaswag|10": {
"acc": 0.6832304321848237,
"acc_stderr": 0.0046426596138574,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.0033806414709899296
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650156,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650156
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.01646981492840617,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.01646981492840617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279046,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279046
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734804,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128438,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5104039167686658,
"mc1_stderr": 0.017499711430249268,
"mc2": 0.6767936843695579,
"mc2_stderr": 0.014902102789887895
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305892
},
"harness|gsm8k|5": {
"acc": 0.5974222896133434,
"acc_stderr": 0.01350852306366343
}
}
```
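The "all" block at the top of the results aggregates the per-task scores; for the MMLU-style tasks, the aggregate accuracy is the mean over the individual hendrycksTest accuracies. A minimal sketch of that relationship, using only a handful of the values shown above (the real leaderboard average is computed over all 57 MMLU subtasks, so this subset average will not match the reported 0.6437 exactly):

```python
# Mean accuracy over a small subset of the hendrycksTest entries above.
# Values copied from the results block; this is illustrative only.

task_acc = {
    "abstract_algebra": 0.29,
    "anatomy": 0.6444444444444445,
    "astronomy": 0.743421052631579,
    "virology": 0.5421686746987951,
    "world_religions": 0.8421052631578947,
}

mmlu_subset_avg = sum(task_acc.values()) / len(task_acc)
print(f"average acc over this subset: {mmlu_subset_avg:.4f}")
```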
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TriadParty/deepsword | ---
license: apache-2.0
---
|
nlpso/m1_qualitative_analysis_ref_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m1_qualitative_analysis_ref_cmbert_iob2
## Introduction
This dataset was used to perform **qualitative analysis** of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the **nested NER task** using the Independent NER layers approach [M1].
It contains Paris trade directory entries from the 19th century.
## Dataset parameters
* Approach : M1
* Dataset type : ground-truth
* Tokenizer : [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner)
* Tagging format : IOB2
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
  * Level 1 : [nlpso/m1_ind_layers_ref_cmbert_iob2_level_1](https://huggingface.co/nlpso/m1_ind_layers_ref_cmbert_iob2_level_1)
  * Level 2 : [nlpso/m1_ind_layers_ref_cmbert_iob2_level_2](https://huggingface.co/nlpso/m1_ind_layers_ref_cmbert_iob2_level_2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
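To illustrate the IOB2 scheme used for both annotation levels, here is a toy tagging of a made-up directory entry (the tokens, spans, and helper function below are hypothetical examples, not taken from the corpus): the first token of an entity is tagged `B-<TYPE>`, subsequent tokens `I-<TYPE>`, and tokens outside any entity `O`.

```python
# Hypothetical directory entry, tokenized.
tokens = ["Dupont", ",", "boulanger", ",", "rue", "Saint-Denis", ",", "12"]

# Hypothetical level-1 spans as (start, end_exclusive, entity_type),
# using the entity types from the table above.
spans = [(0, 1, "PER"), (2, 3, "ACT"), (4, 8, "SPAT")]

def to_iob2(n_tokens, spans):
    """Convert entity spans to an IOB2 tag sequence."""
    tags = ["O"] * n_tokens
    for start, end, label in spans:
        tags[start] = f"B-{label}"
        for i in range(start + 1, end):
            tags[i] = f"I-{label}"
    return tags

print(list(zip(tokens, to_iob2(len(tokens), spans))))
```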
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m1_qualitative_analysis_ref_cmbert_iob2")
```
|
Felladrin/ChatML-aya_dataset | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
annotations_creators:
- crowdsourced
- expert-generated
language:
- amh
- arb
- ary
- ars
- acq
- arz
- apc
- ben
- ceb
- dan
- deu
- ell
- eng
- eus
- fil
- fin
- fra
- gle
- guj
- hat
- hau
- hin
- hun
- ibo
- ind
- ita
- jav
- jpn
- kan
- kir
- kor
- kur
- lit
- mal
- mar
- mlg
- msa
- mya
- nep
- nld
- nso
- nya
- pan
- pes
- pol
- por
- pus
- rus
- sin
- sna
- snd
- som
- spa
- sqi
- srp
- sun
- swa
- swe
- tam
- tel
- tha
- tur
- ukr
- urd
- vie
- wol
- xho
- yor
- zho
- zul
language_creators:
- crowdsourced
- expert-generated
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
---
[CohereForAI/aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) in ChatML format, ready to use in [HuggingFace TRL's SFT Trainer](https://huggingface.co/docs/trl/main/en/sft_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")
dataset = load_dataset("CohereForAI/aya_dataset", split="train")
def format(columns):
messages = [
{
"role": "user",
"content": columns["inputs"].strip(),
},
{
"role": "assistant",
"content": columns["targets"].strip(),
},
]
return { "text": tokenizer.apply_chat_template(messages, tokenize=False) }
dataset.map(format).select_columns(['text', 'language', 'language_code', 'annotation_type', 'user_id']).to_parquet("train.parquet")
```
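For reference, `apply_chat_template` with a ChatML-style template wraps each message in `<|im_start|>`/`<|im_end|>` markers. A simplified sketch of that layout, without the tokenizer (the exact output depends on the tokenizer's `chat_template`, e.g. BOS tokens or trailing newlines, so treat this as an approximation):

```python
# Approximate ChatML rendering of a messages list, mirroring the
# format() function above. Not the tokenizer's exact template.

def chatml(messages):
    return "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )

example = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
]

print(chatml(example))
```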
|
yapayzeka/meto | ---
license: openrail
---
|
open-llm-leaderboard/details_glenn2__gemma-2b-lora3 | ---
pretty_name: Evaluation run of glenn2/gemma-2b-lora3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [glenn2/gemma-2b-lora3](https://huggingface.co/glenn2/gemma-2b-lora3) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_glenn2__gemma-2b-lora3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T12:17:16.722780](https://huggingface.co/datasets/open-llm-leaderboard/details_glenn2__gemma-2b-lora3/blob/main/results_2024-02-29T12-17-16.722780.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38561103829004145,\n\
\ \"acc_stderr\": 0.034063362609756336,\n \"acc_norm\": 0.3876919839462579,\n\
\ \"acc_norm_stderr\": 0.034808766714813213,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.3642319222186389,\n\
\ \"mc2_stderr\": 0.013786837331146062\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4616040955631399,\n \"acc_stderr\": 0.01456824555029636,\n\
\ \"acc_norm\": 0.4726962457337884,\n \"acc_norm_stderr\": 0.014589589101985996\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5294761999601673,\n\
\ \"acc_stderr\": 0.004981103157940433,\n \"acc_norm\": 0.7182832105158335,\n\
\ \"acc_norm_stderr\": 0.004489166767430652\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102149,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102149\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4032258064516129,\n \"acc_stderr\": 0.027906150826041146,\n \"\
acc_norm\": 0.4032258064516129,\n \"acc_norm_stderr\": 0.027906150826041146\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4393939393939394,\n \"acc_stderr\": 0.035360859475294805,\n \"\
acc_norm\": 0.4393939393939394,\n \"acc_norm_stderr\": 0.035360859475294805\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.49222797927461137,\n \"acc_stderr\": 0.03608003225569654,\n\
\ \"acc_norm\": 0.49222797927461137,\n \"acc_norm_stderr\": 0.03608003225569654\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.02340092891831051,\n \
\ \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.02340092891831051\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3319327731092437,\n \"acc_stderr\": 0.030588697013783663,\n\
\ \"acc_norm\": 0.3319327731092437,\n \"acc_norm_stderr\": 0.030588697013783663\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.46422018348623856,\n \"acc_stderr\": 0.021382364775701906,\n \"\
acc_norm\": 0.46422018348623856,\n \"acc_norm_stderr\": 0.021382364775701906\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.029711275860005337,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.029711275860005337\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39215686274509803,\n \"acc_stderr\": 0.03426712349247272,\n \"\
acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.03426712349247272\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4092827004219409,\n \"acc_stderr\": 0.032007041833595914,\n \
\ \"acc_norm\": 0.4092827004219409,\n \"acc_norm_stderr\": 0.032007041833595914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.04950504382128921,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.04950504382128921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n\
\ \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n\
\ \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5325670498084292,\n\
\ \"acc_stderr\": 0.017841995750520874,\n \"acc_norm\": 0.5325670498084292,\n\
\ \"acc_norm_stderr\": 0.017841995750520874\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.34971098265895956,\n \"acc_stderr\": 0.02567428145653102,\n\
\ \"acc_norm\": 0.34971098265895956,\n \"acc_norm_stderr\": 0.02567428145653102\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.01437816988409843,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.01437816988409843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3464052287581699,\n \"acc_stderr\": 0.027245613047215355,\n\
\ \"acc_norm\": 0.3464052287581699,\n \"acc_norm_stderr\": 0.027245613047215355\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422697,\n\
\ \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422697\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.0263580656988806,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.0263580656988806\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31747066492829207,\n\
\ \"acc_stderr\": 0.01188889206880931,\n \"acc_norm\": 0.31747066492829207,\n\
\ \"acc_norm_stderr\": 0.01188889206880931\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.28308823529411764,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.28308823529411764,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.37745098039215685,\n \"acc_stderr\": 0.019610851474880286,\n \
\ \"acc_norm\": 0.37745098039215685,\n \"acc_norm_stderr\": 0.019610851474880286\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.03191282052669277,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.03191282052669277\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4925373134328358,\n\
\ \"acc_stderr\": 0.035351400842767194,\n \"acc_norm\": 0.4925373134328358,\n\
\ \"acc_norm_stderr\": 0.035351400842767194\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.3642319222186389,\n\
\ \"mc2_stderr\": 0.013786837331146062\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6724546172059984,\n \"acc_stderr\": 0.013190169546797017\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17589082638362397,\n \
\ \"acc_stderr\": 0.01048712063553962\n }\n}\n```"
repo_url: https://huggingface.co/glenn2/gemma-2b-lora3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-17-16.722780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-17-16.722780.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- '**/details_harness|winogrande|5_2024-02-29T12-17-16.722780.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T12-17-16.722780.parquet'
- config_name: results
data_files:
- split: 2024_02_29T12_17_16.722780
path:
- results_2024-02-29T12-17-16.722780.parquet
- split: latest
path:
- results_2024-02-29T12-17-16.722780.parquet
---
# Dataset Card for Evaluation run of glenn2/gemma-2b-lora3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [glenn2/gemma-2b-lora3](https://huggingface.co/glenn2/gemma-2b-lora3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_glenn2__gemma-2b-lora3",
"harness_winogrande_5",
split="train")
```
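
Each configuration's splits are named after run timestamps, with a `latest` alias pointing at the newest run. A minimal sketch of resolving the newest timestamped split name yourself (the earlier timestamp below is hypothetical, added for illustration; the other names are taken from this card):

```python
# Split names in a configuration: run timestamps plus the "latest" alias.
# "2024_01_10T08_00_00.000000" is a hypothetical earlier run for illustration.
splits = ["2024_01_10T08_00_00.000000", "2024_02_29T12_17_16.722780", "latest"]

# Zero-padded timestamps sort lexicographically, so max() finds the newest run.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # prints "2024_02_29T12_17_16.722780"
```
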
## Latest results
These are the [latest results from run 2024-02-29T12:17:16.722780](https://huggingface.co/datasets/open-llm-leaderboard/details_glenn2__gemma-2b-lora3/blob/main/results_2024-02-29T12-17-16.722780.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38561103829004145,
"acc_stderr": 0.034063362609756336,
"acc_norm": 0.3876919839462579,
"acc_norm_stderr": 0.034808766714813213,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.3642319222186389,
"mc2_stderr": 0.013786837331146062
},
"harness|arc:challenge|25": {
"acc": 0.4616040955631399,
"acc_stderr": 0.01456824555029636,
"acc_norm": 0.4726962457337884,
"acc_norm_stderr": 0.014589589101985996
},
"harness|hellaswag|10": {
"acc": 0.5294761999601673,
"acc_stderr": 0.004981103157940433,
"acc_norm": 0.7182832105158335,
"acc_norm_stderr": 0.004489166767430652
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102149,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102149
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4032258064516129,
"acc_stderr": 0.027906150826041146,
"acc_norm": 0.4032258064516129,
"acc_norm_stderr": 0.027906150826041146
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4393939393939394,
"acc_stderr": 0.035360859475294805,
"acc_norm": 0.4393939393939394,
"acc_norm_stderr": 0.035360859475294805
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49222797927461137,
"acc_stderr": 0.03608003225569654,
"acc_norm": 0.49222797927461137,
"acc_norm_stderr": 0.03608003225569654
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.02340092891831051,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.02340092891831051
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3319327731092437,
"acc_stderr": 0.030588697013783663,
"acc_norm": 0.3319327731092437,
"acc_norm_stderr": 0.030588697013783663
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.46422018348623856,
"acc_stderr": 0.021382364775701906,
"acc_norm": 0.46422018348623856,
"acc_norm_stderr": 0.021382364775701906
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.029711275860005337,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.029711275860005337
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.03426712349247272,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.03426712349247272
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4092827004219409,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.4092827004219409,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5325670498084292,
"acc_stderr": 0.017841995750520874,
"acc_norm": 0.5325670498084292,
"acc_norm_stderr": 0.017841995750520874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.34971098265895956,
"acc_stderr": 0.02567428145653102,
"acc_norm": 0.34971098265895956,
"acc_norm_stderr": 0.02567428145653102
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.01437816988409843,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.01437816988409843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3464052287581699,
"acc_stderr": 0.027245613047215355,
"acc_norm": 0.3464052287581699,
"acc_norm_stderr": 0.027245613047215355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422697,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.0263580656988806,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.0263580656988806
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31747066492829207,
"acc_stderr": 0.01188889206880931,
"acc_norm": 0.31747066492829207,
"acc_norm_stderr": 0.01188889206880931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.28308823529411764,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.28308823529411764,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.37745098039215685,
"acc_stderr": 0.019610851474880286,
"acc_norm": 0.37745098039215685,
"acc_norm_stderr": 0.019610851474880286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.03191282052669277,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.03191282052669277
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4925373134328358,
"acc_stderr": 0.035351400842767194,
"acc_norm": 0.4925373134328358,
"acc_norm_stderr": 0.035351400842767194
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.3642319222186389,
"mc2_stderr": 0.013786837331146062
},
"harness|winogrande|5": {
"acc": 0.6724546172059984,
"acc_stderr": 0.013190169546797017
},
"harness|gsm8k|5": {
"acc": 0.17589082638362397,
"acc_stderr": 0.01048712063553962
}
}
```
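
Once downloaded, the results JSON above can be consumed directly; a minimal sketch that reads the aggregated "all" section (inlined here from this card so the example runs without downloading anything):

```python
import json

# A fragment of the "all" section from this card's results file,
# inlined so the sketch needs no network access.
results = json.loads("""
{
  "all": {
    "acc": 0.38561103829004145,
    "acc_stderr": 0.034063362609756336,
    "mc2": 0.3642319222186389
  }
}
""")

acc = results["all"]["acc"]
print(f"average accuracy: {acc:.4f}")  # prints "average accuracy: 0.3856"
```
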
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
eswar-2001/haskell-packages-meta-hoogle | ---
license: unknown
---
|
S-AA-D/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4200526
num_examples: 1000
download_size: 2248768
dataset_size: 4200526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_258 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1204130700.0
num_examples: 236475
download_size: 1228465140
dataset_size: 1204130700.0
---
# Dataset Card for "chunk_258"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
timothyckl/OpenOrca-20k | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 33921209
num_examples: 20000
download_size: 19510634
dataset_size: 33921209
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OpenOrca-20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_sample_validation_text_davinci_003_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 12831
num_examples: 10
download_size: 13218
dataset_size: 12831
---
# Dataset Card for "VQAv2_sample_validation_text_davinci_003_mode_T_A_D_PNP_NO_FILTER_C_Q_rices_ns_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_1713021234 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 59817487
num_examples: 155249
download_size: 30484231
dataset_size: 59817487
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Nitral-AI__KukulStanta-7B | ---
pretty_name: Evaluation run of Nitral-AI/KukulStanta-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Nitral-AI/KukulStanta-7B](https://huggingface.co/Nitral-AI/KukulStanta-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Nitral-AI__KukulStanta-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T22:02:03.624829](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__KukulStanta-7B/blob/main/results_2024-04-02T22-02-03.624829.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530472574937952,\n\
\ \"acc_stderr\": 0.032066627003960894,\n \"acc_norm\": 0.6542386436485469,\n\
\ \"acc_norm_stderr\": 0.032713718640515456,\n \"mc1\": 0.4467564259485924,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6218700196422391,\n\
\ \"mc2_stderr\": 0.015309530764000967\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\
\ \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6879107747460665,\n\
\ \"acc_stderr\": 0.004623990785158488,\n \"acc_norm\": 0.8636725751842262,\n\
\ \"acc_norm_stderr\": 0.0034243464481037203\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.01637696614261008,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.01637696614261008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653356,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653356\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4467564259485924,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6218700196422391,\n\
\ \"mc2_stderr\": 0.015309530764000967\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.01123532838262585\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6368460955269143,\n \
\ \"acc_stderr\": 0.013246614539839868\n }\n}\n```"
repo_url: https://huggingface.co/Nitral-AI/KukulStanta-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|arc:challenge|25_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|gsm8k|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hellaswag|10_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-02-03.624829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T22-02-03.624829.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- '**/details_harness|winogrande|5_2024-04-02T22-02-03.624829.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T22-02-03.624829.parquet'
- config_name: results
data_files:
- split: 2024_04_02T22_02_03.624829
path:
- results_2024-04-02T22-02-03.624829.parquet
- split: latest
path:
- results_2024-04-02T22-02-03.624829.parquet
---
# Dataset Card for Evaluation run of Nitral-AI/KukulStanta-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Nitral-AI/KukulStanta-7B](https://huggingface.co/Nitral-AI/KukulStanta-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Nitral-AI__KukulStanta-7B",
"harness_winogrande_5",
split="train")
```
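Beyond `train`, each configuration also exposes one split per timestamped run plus a `latest` alias. Since the timestamps are formatted as `YYYY_MM_DDTHH_MM_SS.ffffff`, lexicographic order matches chronological order, so the most recent run can be picked with a small helper (the split names below are illustrative):

```python
def latest_timestamped_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff, so sorting
    them lexicographically also sorts them chronologically; the literal
    "latest" alias is excluded before taking the maximum.
    """
    timestamped = [name for name in split_names if name != "latest"]
    return max(timestamped)

splits = ["2024_03_30T10_00_00.000000", "2024_04_02T22_02_03.624829", "latest"]
print(latest_timestamped_split(splits))  # 2024_04_02T22_02_03.624829
```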
## Latest results
These are the [latest results from run 2024-04-02T22:02:03.624829](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__KukulStanta-7B/blob/main/results_2024-04-02T22-02-03.624829.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530472574937952,
"acc_stderr": 0.032066627003960894,
"acc_norm": 0.6542386436485469,
"acc_norm_stderr": 0.032713718640515456,
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6218700196422391,
"mc2_stderr": 0.015309530764000967
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815293
},
"harness|hellaswag|10": {
"acc": 0.6879107747460665,
"acc_stderr": 0.004623990785158488,
"acc_norm": 0.8636725751842262,
"acc_norm_stderr": 0.0034243464481037203
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.01637696614261008,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.01637696614261008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653356,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6218700196422391,
"mc2_stderr": 0.015309530764000967
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.01123532838262585
},
"harness|gsm8k|5": {
"acc": 0.6368460955269143,
"acc_stderr": 0.013246614539839868
}
}
```
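The `all` block above aggregates the per-task scores. As a rough illustration (the leaderboard's exact aggregation may weight tasks differently), a macro-average over the per-task `acc` values can be computed from a results dict shaped like the JSON above:

```python
def macro_average_acc(results):
    """Average "acc" across per-task entries, skipping the "all" summary."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    ]
    return sum(accs) / len(accs)

sample = {
    "all": {"acc": 0.65},
    "harness|arc:challenge|25": {"acc": 0.6},
    "harness|hellaswag|10": {"acc": 0.7},
}
print(macro_average_acc(sample))
```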
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
CyberHarem/pina_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pina/朝比奈フィーナ/菲娜 (Blue Archive)
This is the dataset of pina/朝比奈フィーナ/菲娜 (Blue Archive), containing 39 images and their tags.
The core tags of this character are `blonde_hair, breasts, long_hair, blue_eyes, large_breasts, ponytail, halo, braid, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 63.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pina_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 39 | 56.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pina_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 101 | 112.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pina_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pina_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
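The IMG+TXT packages pair every image with a same-stem `.txt` file containing its tags, so they can also be used without waifuc. A minimal sketch (the filenames here are hypothetical) that matches images to their tag files after extracting one of the zips:

```python
import os

def pair_images_with_tags(filenames):
    """Pair each image file with the same-stem .txt tag file, if present."""
    by_stem = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        by_stem.setdefault(stem, {})[ext.lower()] = name
    pairs = []
    for stem in sorted(by_stem):
        files = by_stem[stem]
        for img_ext in (".png", ".jpg", ".jpeg", ".webp"):
            if img_ext in files and ".txt" in files:
                pairs.append((files[img_ext], files[".txt"]))
    return pairs

names = ["0001.png", "0001.txt", "0002.jpg", "0002.txt", "notes.md"]
print(pair_images_with_tags(names))
# [('0001.png', '0001.txt'), ('0002.jpg', '0002.txt')]
```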
## List of Clusters
List of tag clustering results; some outfits can perhaps be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 39 |  |  |  |  |  | cleavage, 1girl, looking_at_viewer, solo, bare_shoulders, smile, open_mouth, red_kimono, blush, collarbone, skirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | cleavage | 1girl | looking_at_viewer | solo | bare_shoulders | smile | open_mouth | red_kimono | blush | collarbone | skirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:--------|:--------------------|:-------|:-----------------|:--------|:-------------|:-------------|:--------|:-------------|:--------|:-------------------|
| 0 | 39 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
Syed-Hasan-8503/distilabel-intel-orca-dpo-pairs-rearranged | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 33022965
num_examples: 12859
download_size: 19066541
dataset_size: 33022965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
CyberHarem/astesia_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of astesia/アステシア/星极 (Arknights)
This is the dataset of astesia/アステシア/星极 (Arknights), containing 389 images and their tags.
The core tags of this character are `blue_hair, long_hair, blue_eyes, bright_pupils, breasts, hair_intakes, white_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 389 | 877.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astesia_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 389 | 710.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astesia_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1009 | 1.31 GiB | [Download](https://huggingface.co/datasets/CyberHarem/astesia_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/astesia_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits can perhaps be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_footwear, black_gloves, elbow_gloves, full_body, hat, high_heels, holding_sword, puffy_short_sleeves, solo, white_dress, white_headwear, blue_dress, closed_mouth, looking_at_viewer, official_alternate_costume, smile, standing, white_background, globe, simple_background, star_(sky) |
| 1 | 7 |  |  |  |  |  | 1girl, black_gloves, hat, simple_background, solo, white_background, white_headwear, closed_mouth, elbow_gloves, looking_at_viewer, medium_breasts, puffy_short_sleeves, smile, cleavage, official_alternate_costume, hand_up, upper_body, blue_dress, ribbon, white_dress |
| 2 | 11 |  |  |  |  |  | 1girl, long_sleeves, solo, cowboy_shot, holding_sword, looking_at_viewer, white_shirt, blue_skirt, closed_mouth, star_(symbol), white_background, diamond-shaped_pupils, neck_ribbon, puffy_sleeves, simple_background |
| 3 | 11 |  |  |  |  |  | 1girl, holding_sword, long_sleeves, solo, full_body, looking_at_viewer, white_shirt, simple_background, black_footwear, black_skirt, closed_mouth, high_heels, white_background, standing, globe, dress |
| 4 | 16 |  |  |  |  |  | 1girl, closed_mouth, looking_at_viewer, solo, white_shirt, simple_background, upper_body, white_background, smile, braid, collared_shirt, long_sleeves, neck_ribbon, puffy_sleeves, diamond_(shape) |
| 5 | 11 |  |  |  |  |  | 1girl, official_alternate_costume, solo, white_dress, white_gloves, bird, looking_at_viewer, holding_microphone, star_(symbol), bare_shoulders, very_long_hair, sleeveless_dress, closed_mouth, parted_lips, smile, white_background |
| 6 | 15 |  |  |  |  |  | 1girl, solo, white_dress, white_gloves, holding_sword, looking_at_viewer, official_alternate_costume, very_long_hair, hair_ornament, bare_shoulders, bird, parted_lips, sleeveless, hair_between_eyes, thighhighs |
| 7 | 7 |  |  |  |  |  | 1girl, blush, hetero, solo_focus, 1boy, large_breasts, navel, nipples, open_mouth, penis, sex, vaginal, completely_nude, spread_legs, cum_in_pussy, heart, mosaic_censoring, necklace, arms_up, collarbone, smile, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_footwear | black_gloves | elbow_gloves | full_body | hat | high_heels | holding_sword | puffy_short_sleeves | solo | white_dress | white_headwear | blue_dress | closed_mouth | looking_at_viewer | official_alternate_costume | smile | standing | white_background | globe | simple_background | star_(sky) | medium_breasts | cleavage | hand_up | upper_body | ribbon | long_sleeves | cowboy_shot | white_shirt | blue_skirt | star_(symbol) | diamond-shaped_pupils | neck_ribbon | puffy_sleeves | black_skirt | dress | braid | collared_shirt | diamond_(shape) | white_gloves | bird | holding_microphone | bare_shoulders | very_long_hair | sleeveless_dress | parted_lips | hair_ornament | sleeveless | hair_between_eyes | thighhighs | blush | hetero | solo_focus | 1boy | large_breasts | navel | nipples | open_mouth | penis | sex | vaginal | completely_nude | spread_legs | cum_in_pussy | heart | mosaic_censoring | necklace | arms_up | collarbone | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:---------------|:------------|:------|:-------------|:----------------|:----------------------|:-------|:--------------|:-----------------|:-------------|:---------------|:--------------------|:-----------------------------|:--------|:-----------|:-------------------|:--------|:--------------------|:-------------|:-----------------|:-----------|:----------|:-------------|:---------|:---------------|:--------------|:--------------|:-------------|:----------------|:------------------------|:--------------|:----------------|:--------------|:--------|:--------|:-----------------|:------------------|:---------------|:-------|:---------------------|:-----------------|:-----------------|:-------------------|:--------------|:----------------|:-------------|:--------------------|:-------------|:--------|:---------|:-------------|:-------|:----------------|:--------|:----------|:-------------|:--------|:------|:----------|:------------------|:--------------|:---------------|:--------|:-------------------|:-----------|:----------|:-------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | | | | | X | | X | | | | X | X | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | | X | | X | X | | X | | | | X | X | | | X | X | X | X | | | | | | | X | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 16 |  |  |  |  |  | X | | | | | | | | | X | | | | X | X | | X | | X | | X | | | | | X | | X | | X | | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | | | | | | | | X | X | | | X | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | | | | | | X | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf | ---
pretty_name: Evaluation run of NousResearch/CodeLlama-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/CodeLlama-7b-hf](https://huggingface.co/NousResearch/CodeLlama-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T04:22:12.772861](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf/blob/main/results_2023-09-17T04-22-12.772861.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n\
\ \"em_stderr\": 0.00023443780464835895,\n \"f1\": 0.05166212248322184,\n\
\ \"f1_stderr\": 0.0012470290169941962,\n \"acc\": 0.3516817229574676,\n\
\ \"acc_stderr\": 0.00983671270422883\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464835895,\n\
\ \"f1\": 0.05166212248322184,\n \"f1_stderr\": 0.0012470290169941962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \
\ \"acc_stderr\": 0.006257444037912531\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.648776637726914,\n \"acc_stderr\": 0.013415981370545131\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NousResearch/CodeLlama-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T04_22_12.772861
path:
- '**/details_harness|drop|3_2023-09-17T04-22-12.772861.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T04-22-12.772861.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T04_22_12.772861
path:
- '**/details_harness|gsm8k|5_2023-09-17T04-22-12.772861.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T04-22-12.772861.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T04_22_12.772861
path:
- '**/details_harness|winogrande|5_2023-09-17T04-22-12.772861.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T04-22-12.772861.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- results_2023-08-26T00:03:47.670325.parquet
- split: 2023_09_17T04_22_12.772861
path:
- results_2023-09-17T04-22-12.772861.parquet
- split: latest
path:
- results_2023-09-17T04-22-12.772861.parquet
---
# Dataset Card for Evaluation run of NousResearch/CodeLlama-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/CodeLlama-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/CodeLlama-7b-hf](https://huggingface.co/NousResearch/CodeLlama-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T04:22:12.772861](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf/blob/main/results_2023-09-17T04-22-12.772861.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464835895,
"f1": 0.05166212248322184,
"f1_stderr": 0.0012470290169941962,
"acc": 0.3516817229574676,
"acc_stderr": 0.00983671270422883
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464835895,
"f1": 0.05166212248322184,
"f1_stderr": 0.0012470290169941962
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912531
},
"harness|winogrande|5": {
"acc": 0.648776637726914,
"acc_stderr": 0.013415981370545131
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
amishshah/imbalanced_0 | ---
dataset_info:
features:
- name: title
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 45166669.74
num_examples: 27000
- name: test
num_bytes: 5018518.86
num_examples: 3000
download_size: 0
dataset_size: 50185188.6
---
# Dataset Card for "imbalanced_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TitleOS/rocketraccoon_personality_alpaca | ---
license: mpl-2.0
task_categories:
- text-generation
language:
- en
tags:
- alpaca
- synthetic data
size_categories:
- n<1K
---
An attempt to imbue a gruff, Rocket Raccoon-like personality from Guardians of the Galaxy into the Rocket 3B model. Alpaca-formatted dataset generated by ehartford_dolphin-2.2.1-mistral-7b. |
yjching/tokenized_ts_tracks | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 114719
num_examples: 8
download_size: 46092
dataset_size: 114719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "yjching"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/la_pluma_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of la_pluma/ラ・プルマ/羽毛笔 (Arknights)
This is the dataset of la_pluma/ラ・プルマ/羽毛笔 (Arknights), containing 500 images and their tags.
The core tags of this character are `black_hair, short_hair, breasts, hair_ornament, ahoge, hairclip, medium_breasts, black_eyes, hair_intakes, feather_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1009.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/la_pluma_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 810.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/la_pluma_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1319 | 1.53 GiB | [Download](https://huggingface.co/datasets/CyberHarem/la_pluma_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/la_pluma_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, bare_shoulders, sleeveless_shirt, solo, looking_at_viewer, full_body, fur-trimmed_jacket, open_clothes, thigh_strap, black_footwear, off_shoulder, sandals, black_gloves, black_shorts, simple_background, white_background, black_coat, black_jacket, holding_scythe, black_shirt, fingerless_gloves, fur-trimmed_coat, short_shorts, holding_weapon |
| 1 | 10 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, off_shoulder, simple_background, sleeveless_shirt, solo, upper_body, white_background, black_shirt, open_clothes, closed_mouth, black_jacket, fur-trimmed_jacket, blush |
| 2 | 8 |  |  |  |  |  | 1girl, bare_shoulders, goggles_around_neck, hair_bun, jacket_over_swimsuit, looking_at_viewer, off_shoulder, official_alternate_costume, open_jacket, solo, white_bikini, cleavage, upper_body, white_jacket, long_sleeves, whistle, white_background, mouth_hold, navel, simple_background |
| 3 | 14 |  |  |  |  |  | 1girl, beachball, goggles_around_neck, jacket_over_swimsuit, long_sleeves, looking_at_viewer, official_alternate_costume, open_jacket, solo, white_bikini, white_jacket, bare_shoulders, navel, off_shoulder, stomach, id_card, lanyard, water, whistle, holding, cleavage, mouth_hold, thigh_strap, frilled_bikini, innertube, sitting, swim_ring, cowboy_shot, feet_out_of_frame, hair_bun, soaking_feet |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, day, goggles_around_neck, id_card, lanyard, looking_at_viewer, off_shoulder, official_alternate_costume, open_jacket, outdoors, sitting, solo, thigh_strap, water, white_bikini, white_jacket, blue_sky, cleavage, grey_eyes, jacket_over_swimsuit, long_sleeves, navel, ocean, single_side_bun, beachball, blush, frilled_bikini, stomach, thighs, barefoot, cloudy_sky, feet, toes, whistle |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | sleeveless_shirt | solo | looking_at_viewer | full_body | fur-trimmed_jacket | open_clothes | thigh_strap | black_footwear | off_shoulder | sandals | black_gloves | black_shorts | simple_background | white_background | black_coat | black_jacket | holding_scythe | black_shirt | fingerless_gloves | fur-trimmed_coat | short_shorts | holding_weapon | upper_body | closed_mouth | blush | goggles_around_neck | hair_bun | jacket_over_swimsuit | official_alternate_costume | open_jacket | white_bikini | cleavage | white_jacket | long_sleeves | whistle | mouth_hold | navel | beachball | stomach | id_card | lanyard | water | holding | frilled_bikini | innertube | sitting | swim_ring | cowboy_shot | feet_out_of_frame | soaking_feet | day | outdoors | blue_sky | grey_eyes | ocean | single_side_bun | thighs | barefoot | cloudy_sky | feet | toes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------------|:-------|:--------------------|:------------|:---------------------|:---------------|:--------------|:-----------------|:---------------|:----------|:---------------|:---------------|:--------------------|:-------------------|:-------------|:---------------|:-----------------|:--------------|:--------------------|:-------------------|:---------------|:-----------------|:-------------|:---------------|:--------|:----------------------|:-----------|:-----------------------|:-----------------------------|:--------------|:---------------|:-----------|:---------------|:---------------|:----------|:-------------|:--------|:------------|:----------|:----------|:----------|:--------|:----------|:-----------------|:------------|:----------|:------------|:--------------|:--------------------|:---------------|:------|:-----------|:-----------|:------------|:--------|:------------------|:---------|:-----------|:-------------|:-------|:-------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | | | | X | X | | X | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | X | | | | | | X | | | | X | X | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | | X | X | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | X | | | | X | | X | | | | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
tet550/jawiki_sentences | ---
license: cc-by-3.0
dataset_info:
features:
- name: article_title
dtype: string
- name: topic_title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 7170293044
num_examples: 31888063
download_size: 3521442667
dataset_size: 7170293044
language:
- ja
---
# Jawiki Sentences Dataset
This dataset was created from articles in the Japanese edition of Wikipedia. Non-sentence material such as unnecessary tags and tables has been removed from the original text as far as possible. Each entry includes the title of the article and of the section that contains the text.
## Data Structure
Each entry consists of the following three fields:
- article_title: a string with the title of the article.
- topic_title: a string with the section title within the article.
- text: a string with the text of the section.
## Data Generation
This dataset was generated from a Japanese Wikipedia dump file with the script below:
https://github.com/tet550/jawiki_sentences
## License
Wikipedia content is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0) and the GNU Free Documentation License (GFDL).
This dataset follows the same licenses.
|
anan-2024/twitter_dataset_1713026453 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26687
num_examples: 60
download_size: 13363
dataset_size: 26687
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/dollyaug-standardized_cluster_2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 15539998
num_examples: 1553
download_size: 4866247
dataset_size: 15539998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dollyaug-standardized_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0x22almostEvil/multilingual-wikihow-qa-16k | ---
license: cc-by-nc-3.0
task_categories:
- question-answering
language:
- en
- ru
- pt
- it
- es
- fr
- de
- nl
pretty_name: multilingual-wikihow-qa-16k
size_categories:
- 10K<n<100K
tags:
- wikihow
- QnA
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
- name: METADATA
dtype: string
splits:
- name: train
num_bytes: 144407512
num_examples: 16822
download_size: 76391535
dataset_size: 144407512
---
# Dataset Card for multilingual WikiHow with ~16.8K entries (~2-2.2K per language)
### Warning [1]
The WikiHow team contacted me and made it clear that **they forbid the use of their data for machine learning purposes**. However, I am not calling for anything; this dataset only demonstrates the concept, and I strongly advise against violating their ToS.
However, consultation with lawyers made it clear that **the dataset can be used** for such purposes if the project has **research purposes**.
### Warning [2]
The source code is kinda **very** bad, and I'm too lazy to fix it.
### Dataset Summary
Contains a Parquet file with a list of instructions and WikiHow articles in different languages.
Each row consists of
* INSTRUCTION
* RESPONSE
* SOURCE (*.wikihow.com)
* METADATA (json with url and language).
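The row structure above can be sketched as follows (the example row is hypothetical; field names follow the list above, and METADATA is assumed to be a JSON string with `url` and `language` keys):

```python
import json

# Hypothetical row following the schema above; values are made up
row = {
    "INSTRUCTION": "How to make tea?",
    "RESPONSE": "Boil water, then steep the leaves...",
    "SOURCE": "www.wikihow.com",
    "METADATA": '{"url": "https://www.wikihow.com/Make-Tea", "language": "en"}',
}

# METADATA is stored as a JSON string, so parse it before use
meta = json.loads(row["METADATA"])
print(meta["language"])  # en
```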
### Licensing Information
Data is from WikiHow; the license for the content is located here:
https://www.wikihow.com/wikiHow:Creative-Commons
### Acknowledgements
This helped me a lot!
https://github.com/HelloChatterbox/PyWikiHow; https://pypi.org/project/pywikihow/ |
MatrixStudio/Codeforces-Python-Submissions-RL | ---
dataset_info:
features:
- name: contestId
dtype: int64
- name: index
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: rating
dtype: int64
- name: tags
sequence: string
- name: title
dtype: string
- name: time-limit
dtype: string
- name: memory-limit
dtype: string
- name: problem-description
dtype: string
- name: input-specification
dtype: string
- name: output-specification
dtype: string
- name: demo-input
sequence: string
- name: demo-output
sequence: string
- name: note
dtype: string
- name: points
dtype: float64
- name: test_cases
list:
- name: input
dtype: string
- name: output
dtype: string
- name: creationTimeSeconds
dtype: int64
- name: relativeTimeSeconds
dtype: int64
- name: programmingLanguage
dtype: string
- name: verdict
dtype: string
- name: testset
dtype: string
- name: passedTestCount
dtype: int64
- name: timeConsumedMillis
dtype: int64
- name: memoryConsumedBytes
dtype: int64
- name: code
dtype: string
- name: prompt
dtype: string
- name: response
sequence: string
- name: score
dtype: float64
- name: reject
dtype: string
splits:
- name: train
num_bytes: 938669463.4435301
num_examples: 127921
- name: test
num_bytes: 113004434.19191754
num_examples: 15784
download_size: 197576147
dataset_size: 1051673897.6354476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
brainer/dacon_low_res_bird_img | ---
dataset_info:
- config_name: default
features:
- name: img_path
dtype: string
- name: upscale_img_path
dtype: string
- name: label
dtype: string
- name: upscale_img
dtype: image
- name: img
dtype: image
splits:
- name: train
num_bytes: 1838069640.5
num_examples: 15834
download_size: 1838780556
dataset_size: 1838069640.5
- config_name: test
features:
- name: id
dtype: string
- name: img_path
dtype: string
- name: img
dtype: image
splits:
- name: train
num_bytes: 54917445.75
num_examples: 6786
download_size: 55311599
dataset_size: 54917445.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: test
data_files:
- split: train
path: test/train-*
---
|
biglam/on_the_books | ---
language:
- en
license: cc-by-3.0
size_categories:
- 1K<n<10K
task_categories:
- text-classification
pretty_name: On the Books Training Set
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: jim_crow
dtype:
class_label:
names:
'0': no_jim_crow
'1': jim_crow
- name: type
dtype: string
- name: chapter_num
dtype: int32
- name: section_num
dtype: int32
- name: chapter_text
dtype: string
- name: section_text
dtype: string
splits:
- name: train
num_bytes: 2119395
num_examples: 1785
download_size: 944579
dataset_size: 2119395
tags:
- lam
- legal
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fanxiaonan/event_classification | ---
license: mit
---
# event classification dataset |
AI4M/michael-af | ---
license: apache-2.0
---
|
susnato/testing10_v1_features | ---
dataset_info:
features:
- name: query_emb_0
dtype: float64
- name: query_emb_1
dtype: float64
- name: query_emb_2
dtype: float64
- name: query_emb_3
dtype: float64
- name: query_emb_4
dtype: float64
- name: query_emb_5
dtype: float64
- name: query_emb_6
dtype: float64
- name: query_emb_7
dtype: float64
- name: query_emb_8
dtype: float64
- name: query_emb_9
dtype: float64
- name: query_emb_10
dtype: float64
- name: query_emb_11
dtype: float64
- name: query_emb_12
dtype: float64
- name: query_emb_13
dtype: float64
- name: query_emb_14
dtype: float64
- name: query_emb_15
dtype: float64
- name: query_emb_16
dtype: float64
- name: query_emb_17
dtype: float64
- name: query_emb_18
dtype: float64
- name: query_emb_19
dtype: float64
- name: query_emb_20
dtype: float64
- name: query_emb_21
dtype: float64
- name: query_emb_22
dtype: float64
- name: query_emb_23
dtype: float64
- name: query_emb_24
dtype: float64
- name: query_emb_25
dtype: float64
- name: query_emb_26
dtype: float64
- name: query_emb_27
dtype: float64
- name: query_emb_28
dtype: float64
- name: query_emb_29
dtype: float64
- name: query_emb_30
dtype: float64
- name: query_emb_31
dtype: float64
- name: query_emb_32
dtype: float64
- name: query_emb_33
dtype: float64
- name: query_emb_34
dtype: float64
- name: query_emb_35
dtype: float64
- name: query_emb_36
dtype: float64
- name: query_emb_37
dtype: float64
- name: query_emb_38
dtype: float64
- name: query_emb_39
dtype: float64
- name: query_emb_40
dtype: float64
- name: query_emb_41
dtype: float64
- name: query_emb_42
dtype: float64
- name: query_emb_43
dtype: float64
- name: query_emb_44
dtype: float64
- name: query_emb_45
dtype: float64
- name: query_emb_46
dtype: float64
- name: query_emb_47
dtype: float64
- name: query_emb_48
dtype: float64
- name: query_emb_49
dtype: float64
- name: query_emb_50
dtype: float64
- name: query_emb_51
dtype: float64
- name: query_emb_52
dtype: float64
- name: query_emb_53
dtype: float64
- name: query_emb_54
dtype: float64
- name: query_emb_55
dtype: float64
- name: query_emb_56
dtype: float64
- name: query_emb_57
dtype: float64
- name: query_emb_58
dtype: float64
- name: query_emb_59
dtype: float64
- name: query_emb_60
dtype: float64
- name: query_emb_61
dtype: float64
- name: query_emb_62
dtype: float64
- name: query_emb_63
dtype: float64
- name: query_emb_64
dtype: float64
- name: query_emb_65
dtype: float64
- name: query_emb_66
dtype: float64
- name: query_emb_67
dtype: float64
- name: query_emb_68
dtype: float64
- name: query_emb_69
dtype: float64
- name: query_emb_70
dtype: float64
- name: query_emb_71
dtype: float64
- name: query_emb_72
dtype: float64
- name: query_emb_73
dtype: float64
- name: query_emb_74
dtype: float64
- name: query_emb_75
dtype: float64
- name: query_emb_76
dtype: float64
- name: query_emb_77
dtype: float64
- name: query_emb_78
dtype: float64
- name: query_emb_79
dtype: float64
- name: query_emb_80
dtype: float64
- name: query_emb_81
dtype: float64
- name: query_emb_82
dtype: float64
- name: query_emb_83
dtype: float64
- name: query_emb_84
dtype: float64
- name: query_emb_85
dtype: float64
- name: query_emb_86
dtype: float64
- name: query_emb_87
dtype: float64
- name: query_emb_88
dtype: float64
- name: query_emb_89
dtype: float64
- name: query_emb_90
dtype: float64
- name: query_emb_91
dtype: float64
- name: query_emb_92
dtype: float64
- name: query_emb_93
dtype: float64
- name: query_emb_94
dtype: float64
- name: query_emb_95
dtype: float64
- name: query_emb_96
dtype: float64
- name: query_emb_97
dtype: float64
- name: query_emb_98
dtype: float64
- name: query_emb_99
dtype: float64
- name: query_emb_100
dtype: float64
- name: query_emb_101
dtype: float64
- name: query_emb_102
dtype: float64
- name: query_emb_103
dtype: float64
- name: query_emb_104
dtype: float64
- name: query_emb_105
dtype: float64
- name: query_emb_106
dtype: float64
- name: query_emb_107
dtype: float64
- name: query_emb_108
dtype: float64
- name: query_emb_109
dtype: float64
- name: query_emb_110
dtype: float64
- name: query_emb_111
dtype: float64
- name: query_emb_112
dtype: float64
- name: query_emb_113
dtype: float64
- name: query_emb_114
dtype: float64
- name: query_emb_115
dtype: float64
- name: query_emb_116
dtype: float64
- name: query_emb_117
dtype: float64
- name: query_emb_118
dtype: float64
- name: query_emb_119
dtype: float64
- name: query_emb_120
dtype: float64
- name: query_emb_121
dtype: float64
- name: query_emb_122
dtype: float64
- name: query_emb_123
dtype: float64
- name: query_emb_124
dtype: float64
- name: query_emb_125
dtype: float64
- name: query_emb_126
dtype: float64
- name: query_emb_127
dtype: float64
- name: query_emb_128
dtype: float64
- name: query_emb_129
dtype: float64
- name: query_emb_130
dtype: float64
- name: query_emb_131
dtype: float64
- name: query_emb_132
dtype: float64
- name: query_emb_133
dtype: float64
- name: query_emb_134
dtype: float64
- name: query_emb_135
dtype: float64
- name: query_emb_136
dtype: float64
- name: query_emb_137
dtype: float64
- name: query_emb_138
dtype: float64
- name: query_emb_139
dtype: float64
- name: query_emb_140
dtype: float64
- name: query_emb_141
dtype: float64
- name: query_emb_142
dtype: float64
- name: query_emb_143
dtype: float64
- name: query_emb_144
dtype: float64
- name: query_emb_145
dtype: float64
- name: query_emb_146
dtype: float64
- name: query_emb_147
dtype: float64
- name: query_emb_148
dtype: float64
- name: query_emb_149
dtype: float64
- name: query_emb_150
dtype: float64
- name: query_emb_151
dtype: float64
- name: query_emb_152
dtype: float64
- name: query_emb_153
dtype: float64
- name: query_emb_154
dtype: float64
- name: query_emb_155
dtype: float64
- name: query_emb_156
dtype: float64
- name: query_emb_157
dtype: float64
- name: query_emb_158
dtype: float64
- name: query_emb_159
dtype: float64
- name: query_emb_160
dtype: float64
- name: query_emb_161
dtype: float64
- name: query_emb_162
dtype: float64
- name: query_emb_163
dtype: float64
- name: query_emb_164
dtype: float64
- name: query_emb_165
dtype: float64
- name: query_emb_166
dtype: float64
- name: query_emb_167
dtype: float64
- name: query_emb_168
dtype: float64
- name: query_emb_169
dtype: float64
- name: query_emb_170
dtype: float64
- name: query_emb_171
dtype: float64
- name: query_emb_172
dtype: float64
- name: query_emb_173
dtype: float64
- name: query_emb_174
dtype: float64
- name: query_emb_175
dtype: float64
- name: query_emb_176
dtype: float64
- name: query_emb_177
dtype: float64
- name: query_emb_178
dtype: float64
- name: query_emb_179
dtype: float64
- name: query_emb_180
dtype: float64
- name: query_emb_181
dtype: float64
- name: query_emb_182
dtype: float64
- name: query_emb_183
dtype: float64
- name: query_emb_184
dtype: float64
- name: query_emb_185
dtype: float64
- name: query_emb_186
dtype: float64
- name: query_emb_187
dtype: float64
- name: query_emb_188
dtype: float64
- name: query_emb_189
dtype: float64
- name: query_emb_190
dtype: float64
- name: query_emb_191
dtype: float64
- name: query_emb_192
dtype: float64
- name: query_emb_193
dtype: float64
- name: query_emb_194
dtype: float64
- name: query_emb_195
dtype: float64
- name: query_emb_196
dtype: float64
- name: query_emb_197
dtype: float64
- name: query_emb_198
dtype: float64
- name: query_emb_199
dtype: float64
- name: query_emb_200
dtype: float64
- name: query_emb_201
dtype: float64
- name: query_emb_202
dtype: float64
- name: query_emb_203
dtype: float64
- name: query_emb_204
dtype: float64
- name: query_emb_205
dtype: float64
- name: query_emb_206
dtype: float64
- name: query_emb_207
dtype: float64
- name: query_emb_208
dtype: float64
- name: query_emb_209
dtype: float64
- name: query_emb_210
dtype: float64
- name: query_emb_211
dtype: float64
- name: query_emb_212
dtype: float64
- name: query_emb_213
dtype: float64
- name: query_emb_214
dtype: float64
- name: query_emb_215
dtype: float64
- name: query_emb_216
dtype: float64
- name: query_emb_217
dtype: float64
- name: query_emb_218
dtype: float64
- name: query_emb_219
dtype: float64
- name: query_emb_220
dtype: float64
- name: query_emb_221
dtype: float64
- name: query_emb_222
dtype: float64
- name: query_emb_223
dtype: float64
- name: query_emb_224
dtype: float64
- name: query_emb_225
dtype: float64
- name: query_emb_226
dtype: float64
- name: query_emb_227
dtype: float64
- name: query_emb_228
dtype: float64
- name: query_emb_229
dtype: float64
- name: query_emb_230
dtype: float64
- name: query_emb_231
dtype: float64
- name: query_emb_232
dtype: float64
- name: query_emb_233
dtype: float64
- name: query_emb_234
dtype: float64
- name: query_emb_235
dtype: float64
- name: query_emb_236
dtype: float64
- name: query_emb_237
dtype: float64
- name: query_emb_238
dtype: float64
- name: query_emb_239
dtype: float64
- name: query_emb_240
dtype: float64
- name: query_emb_241
dtype: float64
- name: query_emb_242
dtype: float64
- name: query_emb_243
dtype: float64
- name: query_emb_244
dtype: float64
- name: query_emb_245
dtype: float64
- name: query_emb_246
dtype: float64
- name: query_emb_247
dtype: float64
- name: query_emb_248
dtype: float64
- name: query_emb_249
dtype: float64
- name: query_emb_250
dtype: float64
- name: query_emb_251
dtype: float64
- name: query_emb_252
dtype: float64
- name: query_emb_253
dtype: float64
- name: query_emb_254
dtype: float64
- name: query_emb_255
dtype: float64
- name: query_emb_256
dtype: float64
- name: query_emb_257
dtype: float64
- name: query_emb_258
dtype: float64
- name: query_emb_259
dtype: float64
- name: query_emb_260
dtype: float64
- name: query_emb_261
dtype: float64
- name: query_emb_262
dtype: float64
- name: query_emb_263
dtype: float64
- name: query_emb_264
dtype: float64
- name: query_emb_265
dtype: float64
- name: query_emb_266
dtype: float64
- name: query_emb_267
dtype: float64
- name: query_emb_268
dtype: float64
- name: query_emb_269
dtype: float64
- name: query_emb_270
dtype: float64
- name: query_emb_271
dtype: float64
- name: query_emb_272
dtype: float64
- name: query_emb_273
dtype: float64
- name: query_emb_274
dtype: float64
- name: query_emb_275
dtype: float64
- name: query_emb_276
dtype: float64
- name: query_emb_277
dtype: float64
- name: query_emb_278
dtype: float64
- name: query_emb_279
dtype: float64
- name: query_emb_280
dtype: float64
- name: query_emb_281
dtype: float64
- name: query_emb_282
dtype: float64
- name: query_emb_283
dtype: float64
- name: query_emb_284
dtype: float64
- name: query_emb_285
dtype: float64
- name: query_emb_286
dtype: float64
- name: query_emb_287
dtype: float64
- name: query_emb_288
dtype: float64
- name: query_emb_289
dtype: float64
- name: query_emb_290
dtype: float64
- name: query_emb_291
dtype: float64
- name: query_emb_292
dtype: float64
- name: query_emb_293
dtype: float64
- name: query_emb_294
dtype: float64
- name: query_emb_295
dtype: float64
- name: query_emb_296
dtype: float64
- name: query_emb_297
dtype: float64
- name: query_emb_298
dtype: float64
- name: query_emb_299
dtype: float64
- name: query_emb_300
dtype: float64
- name: query_emb_301
dtype: float64
- name: query_emb_302
dtype: float64
- name: query_emb_303
dtype: float64
- name: query_emb_304
dtype: float64
- name: query_emb_305
dtype: float64
- name: query_emb_306
dtype: float64
- name: query_emb_307
dtype: float64
- name: query_emb_308
dtype: float64
- name: query_emb_309
dtype: float64
- name: query_emb_310
dtype: float64
- name: query_emb_311
dtype: float64
- name: query_emb_312
dtype: float64
- name: query_emb_313
dtype: float64
- name: query_emb_314
dtype: float64
- name: query_emb_315
dtype: float64
- name: query_emb_316
dtype: float64
- name: query_emb_317
dtype: float64
- name: query_emb_318
dtype: float64
- name: query_emb_319
dtype: float64
- name: query_emb_320
dtype: float64
- name: query_emb_321
dtype: float64
- name: query_emb_322
dtype: float64
- name: query_emb_323
dtype: float64
- name: query_emb_324
dtype: float64
- name: query_emb_325
dtype: float64
- name: query_emb_326
dtype: float64
- name: query_emb_327
dtype: float64
- name: query_emb_328
dtype: float64
- name: query_emb_329
dtype: float64
- name: query_emb_330
dtype: float64
- name: query_emb_331
dtype: float64
- name: query_emb_332
dtype: float64
- name: query_emb_333
dtype: float64
- name: query_emb_334
dtype: float64
- name: query_emb_335
dtype: float64
- name: query_emb_336
dtype: float64
- name: query_emb_337
dtype: float64
- name: query_emb_338
dtype: float64
- name: query_emb_339
dtype: float64
- name: query_emb_340
dtype: float64
- name: query_emb_341
dtype: float64
- name: query_emb_342
dtype: float64
- name: query_emb_343
dtype: float64
- name: query_emb_344
dtype: float64
- name: query_emb_345
dtype: float64
- name: query_emb_346
dtype: float64
- name: query_emb_347
dtype: float64
- name: query_emb_348
dtype: float64
- name: query_emb_349
dtype: float64
- name: query_emb_350
dtype: float64
- name: query_emb_351
dtype: float64
- name: query_emb_352
dtype: float64
- name: query_emb_353
dtype: float64
- name: query_emb_354
dtype: float64
- name: query_emb_355
dtype: float64
- name: query_emb_356
dtype: float64
- name: query_emb_357
dtype: float64
- name: query_emb_358
dtype: float64
- name: query_emb_359
dtype: float64
- name: query_emb_360
dtype: float64
- name: query_emb_361
dtype: float64
- name: query_emb_362
dtype: float64
- name: query_emb_363
dtype: float64
- name: query_emb_364
dtype: float64
- name: query_emb_365
dtype: float64
- name: query_emb_366
dtype: float64
- name: query_emb_367
dtype: float64
- name: query_emb_368
dtype: float64
- name: query_emb_369
dtype: float64
- name: query_emb_370
dtype: float64
- name: query_emb_371
dtype: float64
- name: query_emb_372
dtype: float64
- name: query_emb_373
dtype: float64
- name: query_emb_374
dtype: float64
- name: query_emb_375
dtype: float64
- name: query_emb_376
dtype: float64
- name: query_emb_377
dtype: float64
- name: query_emb_378
dtype: float64
- name: query_emb_379
dtype: float64
- name: query_emb_380
dtype: float64
- name: query_emb_381
dtype: float64
- name: query_emb_382
dtype: float64
- name: query_emb_383
dtype: float64
- name: query_emb_384
dtype: float64
- name: query_emb_385
dtype: float64
- name: query_emb_386
dtype: float64
- name: query_emb_387
dtype: float64
- name: query_emb_388
dtype: float64
- name: query_emb_389
dtype: float64
- name: query_emb_390
dtype: float64
- name: query_emb_391
dtype: float64
- name: query_emb_392
dtype: float64
- name: query_emb_393
dtype: float64
- name: query_emb_394
dtype: float64
- name: query_emb_395
dtype: float64
- name: query_emb_396
dtype: float64
- name: query_emb_397
dtype: float64
- name: query_emb_398
dtype: float64
- name: query_emb_399
dtype: float64
- name: query_emb_400
dtype: float64
- name: query_emb_401
dtype: float64
- name: query_emb_402
dtype: float64
- name: query_emb_403
dtype: float64
- name: query_emb_404
dtype: float64
- name: query_emb_405
dtype: float64
- name: query_emb_406
dtype: float64
- name: query_emb_407
dtype: float64
- name: query_emb_408
dtype: float64
- name: query_emb_409
dtype: float64
- name: query_emb_410
dtype: float64
- name: query_emb_411
dtype: float64
- name: query_emb_412
dtype: float64
- name: query_emb_413
dtype: float64
- name: query_emb_414
dtype: float64
- name: query_emb_415
dtype: float64
- name: query_emb_416
dtype: float64
- name: query_emb_417
dtype: float64
- name: query_emb_418
dtype: float64
- name: query_emb_419
dtype: float64
- name: query_emb_420
dtype: float64
- name: query_emb_421
dtype: float64
- name: query_emb_422
dtype: float64
- name: query_emb_423
dtype: float64
- name: query_emb_424
dtype: float64
- name: query_emb_425
dtype: float64
- name: query_emb_426
dtype: float64
- name: query_emb_427
dtype: float64
- name: query_emb_428
dtype: float64
- name: query_emb_429
dtype: float64
- name: query_emb_430
dtype: float64
- name: query_emb_431
dtype: float64
- name: query_emb_432
dtype: float64
- name: query_emb_433
dtype: float64
- name: query_emb_434
dtype: float64
- name: query_emb_435
dtype: float64
- name: query_emb_436
dtype: float64
- name: query_emb_437
dtype: float64
- name: query_emb_438
dtype: float64
- name: query_emb_439
dtype: float64
- name: query_emb_440
dtype: float64
- name: query_emb_441
dtype: float64
- name: query_emb_442
dtype: float64
- name: query_emb_443
dtype: float64
- name: query_emb_444
dtype: float64
- name: query_emb_445
dtype: float64
- name: query_emb_446
dtype: float64
- name: query_emb_447
dtype: float64
- name: query_emb_448
dtype: float64
- name: query_emb_449
dtype: float64
- name: query_emb_450
dtype: float64
- name: query_emb_451
dtype: float64
- name: query_emb_452
dtype: float64
- name: query_emb_453
dtype: float64
- name: query_emb_454
dtype: float64
- name: query_emb_455
dtype: float64
- name: query_emb_456
dtype: float64
- name: query_emb_457
dtype: float64
- name: query_emb_458
dtype: float64
- name: query_emb_459
dtype: float64
- name: query_emb_460
dtype: float64
- name: query_emb_461
dtype: float64
- name: query_emb_462
dtype: float64
- name: query_emb_463
dtype: float64
- name: query_emb_464
dtype: float64
- name: query_emb_465
dtype: float64
- name: query_emb_466
dtype: float64
- name: query_emb_467
dtype: float64
- name: query_emb_468
dtype: float64
- name: query_emb_469
dtype: float64
- name: query_emb_470
dtype: float64
- name: query_emb_471
dtype: float64
- name: query_emb_472
dtype: float64
- name: query_emb_473
dtype: float64
- name: query_emb_474
dtype: float64
- name: query_emb_475
dtype: float64
- name: query_emb_476
dtype: float64
- name: query_emb_477
dtype: float64
- name: query_emb_478
dtype: float64
- name: query_emb_479
dtype: float64
- name: query_emb_480
dtype: float64
- name: query_emb_481
dtype: float64
- name: query_emb_482
dtype: float64
- name: query_emb_483
dtype: float64
- name: query_emb_484
dtype: float64
- name: query_emb_485
dtype: float64
- name: query_emb_486
dtype: float64
- name: query_emb_487
dtype: float64
- name: query_emb_488
dtype: float64
- name: query_emb_489
dtype: float64
- name: query_emb_490
dtype: float64
- name: query_emb_491
dtype: float64
- name: query_emb_492
dtype: float64
- name: query_emb_493
dtype: float64
- name: query_emb_494
dtype: float64
- name: query_emb_495
dtype: float64
- name: query_emb_496
dtype: float64
- name: query_emb_497
dtype: float64
- name: query_emb_498
dtype: float64
- name: query_emb_499
dtype: float64
- name: query_emb_500
dtype: float64
- name: query_emb_501
dtype: float64
- name: query_emb_502
dtype: float64
- name: query_emb_503
dtype: float64
- name: query_emb_504
dtype: float64
- name: query_emb_505
dtype: float64
- name: query_emb_506
dtype: float64
- name: query_emb_507
dtype: float64
- name: query_emb_508
dtype: float64
- name: query_emb_509
dtype: float64
- name: query_emb_510
dtype: float64
- name: query_emb_511
dtype: float64
- name: query_emb_512
dtype: float64
- name: query_emb_513
dtype: float64
- name: query_emb_514
dtype: float64
- name: query_emb_515
dtype: float64
- name: query_emb_516
dtype: float64
- name: query_emb_517
dtype: float64
- name: query_emb_518
dtype: float64
- name: query_emb_519
dtype: float64
- name: query_emb_520
dtype: float64
- name: query_emb_521
dtype: float64
- name: query_emb_522
dtype: float64
- name: query_emb_523
dtype: float64
- name: query_emb_524
dtype: float64
- name: query_emb_525
dtype: float64
- name: query_emb_526
dtype: float64
- name: query_emb_527
dtype: float64
- name: query_emb_528
dtype: float64
- name: query_emb_529
dtype: float64
- name: query_emb_530
dtype: float64
- name: query_emb_531
dtype: float64
- name: query_emb_532
dtype: float64
- name: query_emb_533
dtype: float64
- name: query_emb_534
dtype: float64
- name: query_emb_535
dtype: float64
- name: query_emb_536
dtype: float64
- name: query_emb_537
dtype: float64
- name: query_emb_538
dtype: float64
- name: query_emb_539
dtype: float64
- name: query_emb_540
dtype: float64
- name: query_emb_541
dtype: float64
- name: query_emb_542
dtype: float64
- name: query_emb_543
dtype: float64
- name: query_emb_544
dtype: float64
- name: query_emb_545
dtype: float64
- name: query_emb_546
dtype: float64
- name: query_emb_547
dtype: float64
- name: query_emb_548
dtype: float64
- name: query_emb_549
dtype: float64
- name: query_emb_550
dtype: float64
- name: query_emb_551
dtype: float64
- name: query_emb_552
dtype: float64
- name: query_emb_553
dtype: float64
- name: query_emb_554
dtype: float64
- name: query_emb_555
dtype: float64
- name: query_emb_556
dtype: float64
- name: query_emb_557
dtype: float64
- name: query_emb_558
dtype: float64
- name: query_emb_559
dtype: float64
- name: query_emb_560
dtype: float64
- name: query_emb_561
dtype: float64
- name: query_emb_562
dtype: float64
- name: query_emb_563
dtype: float64
- name: query_emb_564
dtype: float64
- name: query_emb_565
dtype: float64
- name: query_emb_566
dtype: float64
- name: query_emb_567
dtype: float64
- name: query_emb_568
dtype: float64
- name: query_emb_569
dtype: float64
- name: query_emb_570
dtype: float64
- name: query_emb_571
dtype: float64
- name: query_emb_572
dtype: float64
- name: query_emb_573
dtype: float64
- name: query_emb_574
dtype: float64
- name: query_emb_575
dtype: float64
- name: query_emb_576
dtype: float64
- name: query_emb_577
dtype: float64
- name: query_emb_578
dtype: float64
- name: query_emb_579
dtype: float64
- name: query_emb_580
dtype: float64
- name: query_emb_581
dtype: float64
- name: query_emb_582
dtype: float64
- name: query_emb_583
dtype: float64
- name: query_emb_584
dtype: float64
- name: query_emb_585
dtype: float64
- name: query_emb_586
dtype: float64
- name: query_emb_587
dtype: float64
- name: query_emb_588
dtype: float64
- name: query_emb_589
dtype: float64
- name: query_emb_590
dtype: float64
- name: query_emb_591
dtype: float64
- name: query_emb_592
dtype: float64
- name: query_emb_593
dtype: float64
- name: query_emb_594
dtype: float64
- name: query_emb_595
dtype: float64
- name: query_emb_596
dtype: float64
- name: query_emb_597
dtype: float64
- name: query_emb_598
dtype: float64
- name: query_emb_599
dtype: float64
- name: query_emb_600
dtype: float64
- name: query_emb_601
dtype: float64
- name: query_emb_602
dtype: float64
- name: query_emb_603
dtype: float64
- name: query_emb_604
dtype: float64
- name: query_emb_605
dtype: float64
- name: query_emb_606
dtype: float64
- name: query_emb_607
dtype: float64
- name: query_emb_608
dtype: float64
- name: query_emb_609
dtype: float64
- name: query_emb_610
dtype: float64
- name: query_emb_611
dtype: float64
- name: query_emb_612
dtype: float64
- name: query_emb_613
dtype: float64
- name: query_emb_614
dtype: float64
- name: query_emb_615
dtype: float64
- name: query_emb_616
dtype: float64
- name: query_emb_617
dtype: float64
- name: query_emb_618
dtype: float64
- name: query_emb_619
dtype: float64
- name: query_emb_620
dtype: float64
- name: query_emb_621
dtype: float64
- name: query_emb_622
dtype: float64
- name: query_emb_623
dtype: float64
- name: query_emb_624
dtype: float64
- name: query_emb_625
dtype: float64
- name: query_emb_626
dtype: float64
- name: query_emb_627
dtype: float64
- name: query_emb_628
dtype: float64
- name: query_emb_629
dtype: float64
- name: query_emb_630
dtype: float64
- name: query_emb_631
dtype: float64
- name: query_emb_632
dtype: float64
- name: query_emb_633
dtype: float64
- name: query_emb_634
dtype: float64
- name: query_emb_635
dtype: float64
- name: query_emb_636
dtype: float64
- name: query_emb_637
dtype: float64
- name: query_emb_638
dtype: float64
- name: query_emb_639
dtype: float64
- name: query_emb_640
dtype: float64
- name: query_emb_641
dtype: float64
- name: query_emb_642
dtype: float64
- name: query_emb_643
dtype: float64
- name: query_emb_644
dtype: float64
- name: query_emb_645
dtype: float64
- name: query_emb_646
dtype: float64
- name: query_emb_647
dtype: float64
- name: query_emb_648
dtype: float64
- name: query_emb_649
dtype: float64
- name: query_emb_650
dtype: float64
- name: query_emb_651
dtype: float64
- name: query_emb_652
dtype: float64
- name: query_emb_653
dtype: float64
- name: query_emb_654
dtype: float64
- name: query_emb_655
dtype: float64
- name: query_emb_656
dtype: float64
- name: query_emb_657
dtype: float64
- name: query_emb_658
dtype: float64
- name: query_emb_659
dtype: float64
- name: query_emb_660
dtype: float64
- name: query_emb_661
dtype: float64
- name: query_emb_662
dtype: float64
- name: query_emb_663
dtype: float64
- name: query_emb_664
dtype: float64
- name: query_emb_665
dtype: float64
- name: query_emb_666
dtype: float64
- name: query_emb_667
dtype: float64
- name: query_emb_668
dtype: float64
- name: query_emb_669
dtype: float64
- name: query_emb_670
dtype: float64
- name: query_emb_671
dtype: float64
- name: query_emb_672
dtype: float64
- name: query_emb_673
dtype: float64
- name: query_emb_674
dtype: float64
- name: query_emb_675
dtype: float64
- name: query_emb_676
dtype: float64
- name: query_emb_677
dtype: float64
- name: query_emb_678
dtype: float64
- name: query_emb_679
dtype: float64
- name: query_emb_680
dtype: float64
- name: query_emb_681
dtype: float64
- name: query_emb_682
dtype: float64
- name: query_emb_683
dtype: float64
- name: query_emb_684
dtype: float64
- name: query_emb_685
dtype: float64
- name: query_emb_686
dtype: float64
- name: query_emb_687
dtype: float64
- name: query_emb_688
dtype: float64
- name: query_emb_689
dtype: float64
- name: query_emb_690
dtype: float64
- name: query_emb_691
dtype: float64
- name: query_emb_692
dtype: float64
- name: query_emb_693
dtype: float64
- name: query_emb_694
dtype: float64
- name: query_emb_695
dtype: float64
- name: query_emb_696
dtype: float64
- name: query_emb_697
dtype: float64
- name: query_emb_698
dtype: float64
- name: query_emb_699
dtype: float64
- name: query_emb_700
dtype: float64
- name: query_emb_701
dtype: float64
- name: query_emb_702
dtype: float64
- name: query_emb_703
dtype: float64
- name: query_emb_704
dtype: float64
- name: query_emb_705
dtype: float64
- name: query_emb_706
dtype: float64
- name: query_emb_707
dtype: float64
- name: query_emb_708
dtype: float64
- name: query_emb_709
dtype: float64
- name: query_emb_710
dtype: float64
- name: query_emb_711
dtype: float64
- name: query_emb_712
dtype: float64
- name: query_emb_713
dtype: float64
- name: query_emb_714
dtype: float64
- name: query_emb_715
dtype: float64
- name: query_emb_716
dtype: float64
- name: query_emb_717
dtype: float64
- name: query_emb_718
dtype: float64
- name: query_emb_719
dtype: float64
- name: query_emb_720
dtype: float64
- name: query_emb_721
dtype: float64
- name: query_emb_722
dtype: float64
- name: query_emb_723
dtype: float64
- name: query_emb_724
dtype: float64
- name: query_emb_725
dtype: float64
- name: query_emb_726
dtype: float64
- name: query_emb_727
dtype: float64
- name: query_emb_728
dtype: float64
- name: query_emb_729
dtype: float64
- name: query_emb_730
dtype: float64
- name: query_emb_731
dtype: float64
- name: query_emb_732
dtype: float64
- name: query_emb_733
dtype: float64
- name: query_emb_734
dtype: float64
- name: query_emb_735
dtype: float64
- name: query_emb_736
dtype: float64
- name: query_emb_737
dtype: float64
- name: query_emb_738
dtype: float64
- name: query_emb_739
dtype: float64
- name: query_emb_740
dtype: float64
- name: query_emb_741
dtype: float64
- name: query_emb_742
dtype: float64
- name: query_emb_743
dtype: float64
- name: query_emb_744
dtype: float64
- name: query_emb_745
dtype: float64
- name: query_emb_746
dtype: float64
- name: query_emb_747
dtype: float64
- name: query_emb_748
dtype: float64
- name: query_emb_749
dtype: float64
- name: query_emb_750
dtype: float64
- name: query_emb_751
dtype: float64
- name: query_emb_752
dtype: float64
- name: query_emb_753
dtype: float64
- name: query_emb_754
dtype: float64
- name: query_emb_755
dtype: float64
- name: query_emb_756
dtype: float64
- name: query_emb_757
dtype: float64
- name: query_emb_758
dtype: float64
- name: query_emb_759
dtype: float64
- name: query_emb_760
dtype: float64
- name: query_emb_761
dtype: float64
- name: query_emb_762
dtype: float64
- name: query_emb_763
dtype: float64
- name: query_emb_764
dtype: float64
- name: query_emb_765
dtype: float64
- name: query_emb_766
dtype: float64
- name: query_emb_767
dtype: float64
- name: context_emb_0
dtype: float64
- name: context_emb_1
dtype: float64
- name: context_emb_2
dtype: float64
- name: context_emb_3
dtype: float64
- name: context_emb_4
dtype: float64
- name: context_emb_5
dtype: float64
- name: context_emb_6
dtype: float64
- name: context_emb_7
dtype: float64
- name: context_emb_8
dtype: float64
- name: context_emb_9
dtype: float64
- name: context_emb_10
dtype: float64
- name: context_emb_11
dtype: float64
- name: context_emb_12
dtype: float64
- name: context_emb_13
dtype: float64
- name: context_emb_14
dtype: float64
- name: context_emb_15
dtype: float64
- name: context_emb_16
dtype: float64
- name: context_emb_17
dtype: float64
- name: context_emb_18
dtype: float64
- name: context_emb_19
dtype: float64
- name: context_emb_20
dtype: float64
- name: context_emb_21
dtype: float64
- name: context_emb_22
dtype: float64
- name: context_emb_23
dtype: float64
- name: context_emb_24
dtype: float64
- name: context_emb_25
dtype: float64
- name: context_emb_26
dtype: float64
- name: context_emb_27
dtype: float64
- name: context_emb_28
dtype: float64
- name: context_emb_29
dtype: float64
- name: context_emb_30
dtype: float64
- name: context_emb_31
dtype: float64
- name: context_emb_32
dtype: float64
- name: context_emb_33
dtype: float64
- name: context_emb_34
dtype: float64
- name: context_emb_35
dtype: float64
- name: context_emb_36
dtype: float64
- name: context_emb_37
dtype: float64
- name: context_emb_38
dtype: float64
- name: context_emb_39
dtype: float64
- name: context_emb_40
dtype: float64
- name: context_emb_41
dtype: float64
- name: context_emb_42
dtype: float64
- name: context_emb_43
dtype: float64
- name: context_emb_44
dtype: float64
- name: context_emb_45
dtype: float64
- name: context_emb_46
dtype: float64
- name: context_emb_47
dtype: float64
- name: context_emb_48
dtype: float64
- name: context_emb_49
dtype: float64
- name: context_emb_50
dtype: float64
- name: context_emb_51
dtype: float64
- name: context_emb_52
dtype: float64
- name: context_emb_53
dtype: float64
- name: context_emb_54
dtype: float64
- name: context_emb_55
dtype: float64
- name: context_emb_56
dtype: float64
- name: context_emb_57
dtype: float64
- name: context_emb_58
dtype: float64
- name: context_emb_59
dtype: float64
- name: context_emb_60
dtype: float64
- name: context_emb_61
dtype: float64
- name: context_emb_62
dtype: float64
- name: context_emb_63
dtype: float64
- name: context_emb_64
dtype: float64
- name: context_emb_65
dtype: float64
- name: context_emb_66
dtype: float64
- name: context_emb_67
dtype: float64
- name: context_emb_68
dtype: float64
- name: context_emb_69
dtype: float64
- name: context_emb_70
dtype: float64
- name: context_emb_71
dtype: float64
- name: context_emb_72
dtype: float64
- name: context_emb_73
dtype: float64
- name: context_emb_74
dtype: float64
- name: context_emb_75
dtype: float64
- name: context_emb_76
dtype: float64
- name: context_emb_77
dtype: float64
- name: context_emb_78
dtype: float64
- name: context_emb_79
dtype: float64
- name: context_emb_80
dtype: float64
- name: context_emb_81
dtype: float64
- name: context_emb_82
dtype: float64
- name: context_emb_83
dtype: float64
- name: context_emb_84
dtype: float64
- name: context_emb_85
dtype: float64
- name: context_emb_86
dtype: float64
- name: context_emb_87
dtype: float64
- name: context_emb_88
dtype: float64
- name: context_emb_89
dtype: float64
- name: context_emb_90
dtype: float64
- name: context_emb_91
dtype: float64
- name: context_emb_92
dtype: float64
- name: context_emb_93
dtype: float64
- name: context_emb_94
dtype: float64
- name: context_emb_95
dtype: float64
- name: context_emb_96
dtype: float64
- name: context_emb_97
dtype: float64
- name: context_emb_98
dtype: float64
- name: context_emb_99
dtype: float64
- name: context_emb_100
dtype: float64
- name: context_emb_101
dtype: float64
- name: context_emb_102
dtype: float64
- name: context_emb_103
dtype: float64
- name: context_emb_104
dtype: float64
- name: context_emb_105
dtype: float64
- name: context_emb_106
dtype: float64
- name: context_emb_107
dtype: float64
- name: context_emb_108
dtype: float64
- name: context_emb_109
dtype: float64
- name: context_emb_110
dtype: float64
- name: context_emb_111
dtype: float64
- name: context_emb_112
dtype: float64
- name: context_emb_113
dtype: float64
- name: context_emb_114
dtype: float64
- name: context_emb_115
dtype: float64
- name: context_emb_116
dtype: float64
- name: context_emb_117
dtype: float64
- name: context_emb_118
dtype: float64
- name: context_emb_119
dtype: float64
- name: context_emb_120
dtype: float64
- name: context_emb_121
dtype: float64
- name: context_emb_122
dtype: float64
- name: context_emb_123
dtype: float64
- name: context_emb_124
dtype: float64
- name: context_emb_125
dtype: float64
- name: context_emb_126
dtype: float64
- name: context_emb_127
dtype: float64
- name: context_emb_128
dtype: float64
- name: context_emb_129
dtype: float64
- name: context_emb_130
dtype: float64
- name: context_emb_131
dtype: float64
- name: context_emb_132
dtype: float64
- name: context_emb_133
dtype: float64
- name: context_emb_134
dtype: float64
- name: context_emb_135
dtype: float64
- name: context_emb_136
dtype: float64
- name: context_emb_137
dtype: float64
- name: context_emb_138
dtype: float64
- name: context_emb_139
dtype: float64
- name: context_emb_140
dtype: float64
- name: context_emb_141
dtype: float64
- name: context_emb_142
dtype: float64
- name: context_emb_143
dtype: float64
- name: context_emb_144
dtype: float64
- name: context_emb_145
dtype: float64
- name: context_emb_146
dtype: float64
- name: context_emb_147
dtype: float64
- name: context_emb_148
dtype: float64
- name: context_emb_149
dtype: float64
- name: context_emb_150
dtype: float64
- name: context_emb_151
dtype: float64
- name: context_emb_152
dtype: float64
- name: context_emb_153
dtype: float64
- name: context_emb_154
dtype: float64
- name: context_emb_155
dtype: float64
- name: context_emb_156
dtype: float64
- name: context_emb_157
dtype: float64
- name: context_emb_158
dtype: float64
- name: context_emb_159
dtype: float64
- name: context_emb_160
dtype: float64
- name: context_emb_161
dtype: float64
- name: context_emb_162
dtype: float64
- name: context_emb_163
dtype: float64
- name: context_emb_164
dtype: float64
- name: context_emb_165
dtype: float64
- name: context_emb_166
dtype: float64
- name: context_emb_167
dtype: float64
- name: context_emb_168
dtype: float64
- name: context_emb_169
dtype: float64
- name: context_emb_170
dtype: float64
- name: context_emb_171
dtype: float64
- name: context_emb_172
dtype: float64
- name: context_emb_173
dtype: float64
- name: context_emb_174
dtype: float64
- name: context_emb_175
dtype: float64
- name: context_emb_176
dtype: float64
- name: context_emb_177
dtype: float64
- name: context_emb_178
dtype: float64
- name: context_emb_179
dtype: float64
- name: context_emb_180
dtype: float64
- name: context_emb_181
dtype: float64
- name: context_emb_182
dtype: float64
- name: context_emb_183
dtype: float64
- name: context_emb_184
dtype: float64
- name: context_emb_185
dtype: float64
- name: context_emb_186
dtype: float64
- name: context_emb_187
dtype: float64
- name: context_emb_188
dtype: float64
- name: context_emb_189
dtype: float64
- name: context_emb_190
dtype: float64
- name: context_emb_191
dtype: float64
- name: context_emb_192
dtype: float64
- name: context_emb_193
dtype: float64
- name: context_emb_194
dtype: float64
- name: context_emb_195
dtype: float64
- name: context_emb_196
dtype: float64
- name: context_emb_197
dtype: float64
- name: context_emb_198
dtype: float64
- name: context_emb_199
dtype: float64
- name: context_emb_200
dtype: float64
- name: context_emb_201
dtype: float64
- name: context_emb_202
dtype: float64
- name: context_emb_203
dtype: float64
- name: context_emb_204
dtype: float64
- name: context_emb_205
dtype: float64
- name: context_emb_206
dtype: float64
- name: context_emb_207
dtype: float64
- name: context_emb_208
dtype: float64
- name: context_emb_209
dtype: float64
- name: context_emb_210
dtype: float64
- name: context_emb_211
dtype: float64
- name: context_emb_212
dtype: float64
- name: context_emb_213
dtype: float64
- name: context_emb_214
dtype: float64
- name: context_emb_215
dtype: float64
- name: context_emb_216
dtype: float64
- name: context_emb_217
dtype: float64
- name: context_emb_218
dtype: float64
- name: context_emb_219
dtype: float64
- name: context_emb_220
dtype: float64
- name: context_emb_221
dtype: float64
- name: context_emb_222
dtype: float64
- name: context_emb_223
dtype: float64
- name: context_emb_224
dtype: float64
- name: context_emb_225
dtype: float64
- name: context_emb_226
dtype: float64
- name: context_emb_227
dtype: float64
- name: context_emb_228
dtype: float64
- name: context_emb_229
dtype: float64
- name: context_emb_230
dtype: float64
- name: context_emb_231
dtype: float64
- name: context_emb_232
dtype: float64
- name: context_emb_233
dtype: float64
- name: context_emb_234
dtype: float64
- name: context_emb_235
dtype: float64
- name: context_emb_236
dtype: float64
- name: context_emb_237
dtype: float64
- name: context_emb_238
dtype: float64
- name: context_emb_239
dtype: float64
- name: context_emb_240
dtype: float64
- name: context_emb_241
dtype: float64
- name: context_emb_242
dtype: float64
- name: context_emb_243
dtype: float64
- name: context_emb_244
dtype: float64
- name: context_emb_245
dtype: float64
- name: context_emb_246
dtype: float64
- name: context_emb_247
dtype: float64
- name: context_emb_248
dtype: float64
- name: context_emb_249
dtype: float64
- name: context_emb_250
dtype: float64
- name: context_emb_251
dtype: float64
- name: context_emb_252
dtype: float64
- name: context_emb_253
dtype: float64
- name: context_emb_254
dtype: float64
- name: context_emb_255
dtype: float64
- name: context_emb_256
dtype: float64
- name: context_emb_257
dtype: float64
- name: context_emb_258
dtype: float64
- name: context_emb_259
dtype: float64
- name: context_emb_260
dtype: float64
- name: context_emb_261
dtype: float64
- name: context_emb_262
dtype: float64
- name: context_emb_263
dtype: float64
- name: context_emb_264
dtype: float64
- name: context_emb_265
dtype: float64
- name: context_emb_266
dtype: float64
- name: context_emb_267
dtype: float64
- name: context_emb_268
dtype: float64
- name: context_emb_269
dtype: float64
- name: context_emb_270
dtype: float64
- name: context_emb_271
dtype: float64
- name: context_emb_272
dtype: float64
- name: context_emb_273
dtype: float64
- name: context_emb_274
dtype: float64
- name: context_emb_275
dtype: float64
- name: context_emb_276
dtype: float64
- name: context_emb_277
dtype: float64
- name: context_emb_278
dtype: float64
- name: context_emb_279
dtype: float64
- name: context_emb_280
dtype: float64
- name: context_emb_281
dtype: float64
- name: context_emb_282
dtype: float64
- name: context_emb_283
dtype: float64
- name: context_emb_284
dtype: float64
- name: context_emb_285
dtype: float64
- name: context_emb_286
dtype: float64
- name: context_emb_287
dtype: float64
- name: context_emb_288
dtype: float64
- name: context_emb_289
dtype: float64
- name: context_emb_290
dtype: float64
- name: context_emb_291
dtype: float64
- name: context_emb_292
dtype: float64
- name: context_emb_293
dtype: float64
- name: context_emb_294
dtype: float64
- name: context_emb_295
dtype: float64
- name: context_emb_296
dtype: float64
- name: context_emb_297
dtype: float64
- name: context_emb_298
dtype: float64
- name: context_emb_299
dtype: float64
- name: context_emb_300
dtype: float64
- name: context_emb_301
dtype: float64
- name: context_emb_302
dtype: float64
- name: context_emb_303
dtype: float64
- name: context_emb_304
dtype: float64
- name: context_emb_305
dtype: float64
- name: context_emb_306
dtype: float64
- name: context_emb_307
dtype: float64
- name: context_emb_308
dtype: float64
- name: context_emb_309
dtype: float64
- name: context_emb_310
dtype: float64
- name: context_emb_311
dtype: float64
- name: context_emb_312
dtype: float64
- name: context_emb_313
dtype: float64
- name: context_emb_314
dtype: float64
- name: context_emb_315
dtype: float64
- name: context_emb_316
dtype: float64
- name: context_emb_317
dtype: float64
- name: context_emb_318
dtype: float64
- name: context_emb_319
dtype: float64
- name: context_emb_320
dtype: float64
- name: context_emb_321
dtype: float64
- name: context_emb_322
dtype: float64
- name: context_emb_323
dtype: float64
- name: context_emb_324
dtype: float64
- name: context_emb_325
dtype: float64
- name: context_emb_326
dtype: float64
- name: context_emb_327
dtype: float64
- name: context_emb_328
dtype: float64
- name: context_emb_329
dtype: float64
- name: context_emb_330
dtype: float64
- name: context_emb_331
dtype: float64
- name: context_emb_332
dtype: float64
- name: context_emb_333
dtype: float64
- name: context_emb_334
dtype: float64
- name: context_emb_335
dtype: float64
- name: context_emb_336
dtype: float64
- name: context_emb_337
dtype: float64
- name: context_emb_338
dtype: float64
- name: context_emb_339
dtype: float64
- name: context_emb_340
dtype: float64
- name: context_emb_341
dtype: float64
- name: context_emb_342
dtype: float64
- name: context_emb_343
dtype: float64
- name: context_emb_344
dtype: float64
- name: context_emb_345
dtype: float64
- name: context_emb_346
dtype: float64
- name: context_emb_347
dtype: float64
- name: context_emb_348
dtype: float64
- name: context_emb_349
dtype: float64
- name: context_emb_350
dtype: float64
- name: context_emb_351
dtype: float64
- name: context_emb_352
dtype: float64
- name: context_emb_353
dtype: float64
- name: context_emb_354
dtype: float64
- name: context_emb_355
dtype: float64
- name: context_emb_356
dtype: float64
- name: context_emb_357
dtype: float64
- name: context_emb_358
dtype: float64
- name: context_emb_359
dtype: float64
- name: context_emb_360
dtype: float64
- name: context_emb_361
dtype: float64
- name: context_emb_362
dtype: float64
- name: context_emb_363
dtype: float64
- name: context_emb_364
dtype: float64
- name: context_emb_365
dtype: float64
- name: context_emb_366
dtype: float64
- name: context_emb_367
dtype: float64
- name: context_emb_368
dtype: float64
- name: context_emb_369
dtype: float64
- name: context_emb_370
dtype: float64
- name: context_emb_371
dtype: float64
- name: context_emb_372
dtype: float64
- name: context_emb_373
dtype: float64
- name: context_emb_374
dtype: float64
- name: context_emb_375
dtype: float64
- name: context_emb_376
dtype: float64
- name: context_emb_377
dtype: float64
- name: context_emb_378
dtype: float64
- name: context_emb_379
dtype: float64
- name: context_emb_380
dtype: float64
- name: context_emb_381
dtype: float64
- name: context_emb_382
dtype: float64
- name: context_emb_383
dtype: float64
- name: context_emb_384
dtype: float64
- name: context_emb_385
dtype: float64
- name: context_emb_386
dtype: float64
- name: context_emb_387
dtype: float64
- name: context_emb_388
dtype: float64
- name: context_emb_389
dtype: float64
- name: context_emb_390
dtype: float64
- name: context_emb_391
dtype: float64
- name: context_emb_392
dtype: float64
- name: context_emb_393
dtype: float64
- name: context_emb_394
dtype: float64
- name: context_emb_395
dtype: float64
- name: context_emb_396
dtype: float64
- name: context_emb_397
dtype: float64
- name: context_emb_398
dtype: float64
- name: context_emb_399
dtype: float64
- name: context_emb_400
dtype: float64
- name: context_emb_401
dtype: float64
- name: context_emb_402
dtype: float64
- name: context_emb_403
dtype: float64
- name: context_emb_404
dtype: float64
- name: context_emb_405
dtype: float64
- name: context_emb_406
dtype: float64
- name: context_emb_407
dtype: float64
- name: context_emb_408
dtype: float64
- name: context_emb_409
dtype: float64
- name: context_emb_410
dtype: float64
- name: context_emb_411
dtype: float64
- name: context_emb_412
dtype: float64
- name: context_emb_413
dtype: float64
- name: context_emb_414
dtype: float64
- name: context_emb_415
dtype: float64
- name: context_emb_416
dtype: float64
- name: context_emb_417
dtype: float64
- name: context_emb_418
dtype: float64
- name: context_emb_419
dtype: float64
- name: context_emb_420
dtype: float64
- name: context_emb_421
dtype: float64
- name: context_emb_422
dtype: float64
- name: context_emb_423
dtype: float64
- name: context_emb_424
dtype: float64
- name: context_emb_425
dtype: float64
- name: context_emb_426
dtype: float64
- name: context_emb_427
dtype: float64
- name: context_emb_428
dtype: float64
- name: context_emb_429
dtype: float64
- name: context_emb_430
dtype: float64
- name: context_emb_431
dtype: float64
- name: context_emb_432
dtype: float64
- name: context_emb_433
dtype: float64
- name: context_emb_434
dtype: float64
- name: context_emb_435
dtype: float64
- name: context_emb_436
dtype: float64
- name: context_emb_437
dtype: float64
- name: context_emb_438
dtype: float64
- name: context_emb_439
dtype: float64
- name: context_emb_440
dtype: float64
- name: context_emb_441
dtype: float64
- name: context_emb_442
dtype: float64
- name: context_emb_443
dtype: float64
- name: context_emb_444
dtype: float64
- name: context_emb_445
dtype: float64
- name: context_emb_446
dtype: float64
- name: context_emb_447
dtype: float64
- name: context_emb_448
dtype: float64
- name: context_emb_449
dtype: float64
- name: context_emb_450
dtype: float64
- name: context_emb_451
dtype: float64
- name: context_emb_452
dtype: float64
- name: context_emb_453
dtype: float64
- name: context_emb_454
dtype: float64
- name: context_emb_455
dtype: float64
- name: context_emb_456
dtype: float64
- name: context_emb_457
dtype: float64
- name: context_emb_458
dtype: float64
- name: context_emb_459
dtype: float64
- name: context_emb_460
dtype: float64
- name: context_emb_461
dtype: float64
- name: context_emb_462
dtype: float64
- name: context_emb_463
dtype: float64
- name: context_emb_464
dtype: float64
- name: context_emb_465
dtype: float64
- name: context_emb_466
dtype: float64
- name: context_emb_467
dtype: float64
- name: context_emb_468
dtype: float64
- name: context_emb_469
dtype: float64
- name: context_emb_470
dtype: float64
- name: context_emb_471
dtype: float64
- name: context_emb_472
dtype: float64
- name: context_emb_473
dtype: float64
- name: context_emb_474
dtype: float64
- name: context_emb_475
dtype: float64
- name: context_emb_476
dtype: float64
- name: context_emb_477
dtype: float64
- name: context_emb_478
dtype: float64
- name: context_emb_479
dtype: float64
- name: context_emb_480
dtype: float64
- name: context_emb_481
dtype: float64
- name: context_emb_482
dtype: float64
- name: context_emb_483
dtype: float64
- name: context_emb_484
dtype: float64
- name: context_emb_485
dtype: float64
- name: context_emb_486
dtype: float64
- name: context_emb_487
dtype: float64
- name: context_emb_488
dtype: float64
- name: context_emb_489
dtype: float64
- name: context_emb_490
dtype: float64
- name: context_emb_491
dtype: float64
- name: context_emb_492
dtype: float64
- name: context_emb_493
dtype: float64
- name: context_emb_494
dtype: float64
- name: context_emb_495
dtype: float64
- name: context_emb_496
dtype: float64
- name: context_emb_497
dtype: float64
- name: context_emb_498
dtype: float64
- name: context_emb_499
dtype: float64
- name: context_emb_500
dtype: float64
- name: context_emb_501
dtype: float64
- name: context_emb_502
dtype: float64
- name: context_emb_503
dtype: float64
- name: context_emb_504
dtype: float64
- name: context_emb_505
dtype: float64
- name: context_emb_506
dtype: float64
- name: context_emb_507
dtype: float64
- name: context_emb_508
dtype: float64
- name: context_emb_509
dtype: float64
- name: context_emb_510
dtype: float64
- name: context_emb_511
dtype: float64
- name: context_emb_512
dtype: float64
- name: context_emb_513
dtype: float64
- name: context_emb_514
dtype: float64
- name: context_emb_515
dtype: float64
- name: context_emb_516
dtype: float64
- name: context_emb_517
dtype: float64
- name: context_emb_518
dtype: float64
- name: context_emb_519
dtype: float64
- name: context_emb_520
dtype: float64
- name: context_emb_521
dtype: float64
- name: context_emb_522
dtype: float64
- name: context_emb_523
dtype: float64
- name: context_emb_524
dtype: float64
- name: context_emb_525
dtype: float64
- name: context_emb_526
dtype: float64
- name: context_emb_527
dtype: float64
- name: context_emb_528
dtype: float64
- name: context_emb_529
dtype: float64
- name: context_emb_530
dtype: float64
- name: context_emb_531
dtype: float64
- name: context_emb_532
dtype: float64
- name: context_emb_533
dtype: float64
- name: context_emb_534
dtype: float64
- name: context_emb_535
dtype: float64
- name: context_emb_536
dtype: float64
- name: context_emb_537
dtype: float64
- name: context_emb_538
dtype: float64
- name: context_emb_539
dtype: float64
- name: context_emb_540
dtype: float64
- name: context_emb_541
dtype: float64
- name: context_emb_542
dtype: float64
- name: context_emb_543
dtype: float64
- name: context_emb_544
dtype: float64
- name: context_emb_545
dtype: float64
- name: context_emb_546
dtype: float64
- name: context_emb_547
dtype: float64
- name: context_emb_548
dtype: float64
- name: context_emb_549
dtype: float64
- name: context_emb_550
dtype: float64
- name: context_emb_551
dtype: float64
- name: context_emb_552
dtype: float64
- name: context_emb_553
dtype: float64
- name: context_emb_554
dtype: float64
- name: context_emb_555
dtype: float64
- name: context_emb_556
dtype: float64
- name: context_emb_557
dtype: float64
- name: context_emb_558
dtype: float64
- name: context_emb_559
dtype: float64
- name: context_emb_560
dtype: float64
- name: context_emb_561
dtype: float64
- name: context_emb_562
dtype: float64
- name: context_emb_563
dtype: float64
- name: context_emb_564
dtype: float64
- name: context_emb_565
dtype: float64
- name: context_emb_566
dtype: float64
- name: context_emb_567
dtype: float64
- name: context_emb_568
dtype: float64
- name: context_emb_569
dtype: float64
- name: context_emb_570
dtype: float64
- name: context_emb_571
dtype: float64
- name: context_emb_572
dtype: float64
- name: context_emb_573
dtype: float64
- name: context_emb_574
dtype: float64
- name: context_emb_575
dtype: float64
- name: context_emb_576
dtype: float64
- name: context_emb_577
dtype: float64
- name: context_emb_578
dtype: float64
- name: context_emb_579
dtype: float64
- name: context_emb_580
dtype: float64
- name: context_emb_581
dtype: float64
- name: context_emb_582
dtype: float64
- name: context_emb_583
dtype: float64
- name: context_emb_584
dtype: float64
- name: context_emb_585
dtype: float64
- name: context_emb_586
dtype: float64
- name: context_emb_587
dtype: float64
- name: context_emb_588
dtype: float64
- name: context_emb_589
dtype: float64
- name: context_emb_590
dtype: float64
- name: context_emb_591
dtype: float64
- name: context_emb_592
dtype: float64
- name: context_emb_593
dtype: float64
- name: context_emb_594
dtype: float64
- name: context_emb_595
dtype: float64
- name: context_emb_596
dtype: float64
- name: context_emb_597
dtype: float64
- name: context_emb_598
dtype: float64
- name: context_emb_599
dtype: float64
- name: context_emb_600
dtype: float64
- name: context_emb_601
dtype: float64
- name: context_emb_602
dtype: float64
- name: context_emb_603
dtype: float64
- name: context_emb_604
dtype: float64
- name: context_emb_605
dtype: float64
- name: context_emb_606
dtype: float64
- name: context_emb_607
dtype: float64
- name: context_emb_608
dtype: float64
- name: context_emb_609
dtype: float64
- name: context_emb_610
dtype: float64
- name: context_emb_611
dtype: float64
- name: context_emb_612
dtype: float64
- name: context_emb_613
dtype: float64
- name: context_emb_614
dtype: float64
- name: context_emb_615
dtype: float64
- name: context_emb_616
dtype: float64
- name: context_emb_617
dtype: float64
- name: context_emb_618
dtype: float64
- name: context_emb_619
dtype: float64
- name: context_emb_620
dtype: float64
- name: context_emb_621
dtype: float64
- name: context_emb_622
dtype: float64
- name: context_emb_623
dtype: float64
- name: context_emb_624
dtype: float64
- name: context_emb_625
dtype: float64
- name: context_emb_626
dtype: float64
- name: context_emb_627
dtype: float64
- name: context_emb_628
dtype: float64
- name: context_emb_629
dtype: float64
- name: context_emb_630
dtype: float64
- name: context_emb_631
dtype: float64
- name: context_emb_632
dtype: float64
- name: context_emb_633
dtype: float64
- name: context_emb_634
dtype: float64
- name: context_emb_635
dtype: float64
- name: context_emb_636
dtype: float64
- name: context_emb_637
dtype: float64
- name: context_emb_638
dtype: float64
- name: context_emb_639
dtype: float64
- name: context_emb_640
dtype: float64
- name: context_emb_641
dtype: float64
- name: context_emb_642
dtype: float64
- name: context_emb_643
dtype: float64
- name: context_emb_644
dtype: float64
- name: context_emb_645
dtype: float64
- name: context_emb_646
dtype: float64
- name: context_emb_647
dtype: float64
- name: context_emb_648
dtype: float64
- name: context_emb_649
dtype: float64
- name: context_emb_650
dtype: float64
- name: context_emb_651
dtype: float64
- name: context_emb_652
dtype: float64
- name: context_emb_653
dtype: float64
- name: context_emb_654
dtype: float64
- name: context_emb_655
dtype: float64
- name: context_emb_656
dtype: float64
- name: context_emb_657
dtype: float64
- name: context_emb_658
dtype: float64
- name: context_emb_659
dtype: float64
- name: context_emb_660
dtype: float64
- name: context_emb_661
dtype: float64
- name: context_emb_662
dtype: float64
- name: context_emb_663
dtype: float64
- name: context_emb_664
dtype: float64
- name: context_emb_665
dtype: float64
- name: context_emb_666
dtype: float64
- name: context_emb_667
dtype: float64
- name: context_emb_668
dtype: float64
- name: context_emb_669
dtype: float64
- name: context_emb_670
dtype: float64
- name: context_emb_671
dtype: float64
- name: context_emb_672
dtype: float64
- name: context_emb_673
dtype: float64
- name: context_emb_674
dtype: float64
- name: context_emb_675
dtype: float64
- name: context_emb_676
dtype: float64
- name: context_emb_677
dtype: float64
- name: context_emb_678
dtype: float64
- name: context_emb_679
dtype: float64
- name: context_emb_680
dtype: float64
- name: context_emb_681
dtype: float64
- name: context_emb_682
dtype: float64
- name: context_emb_683
dtype: float64
- name: context_emb_684
dtype: float64
- name: context_emb_685
dtype: float64
- name: context_emb_686
dtype: float64
- name: context_emb_687
dtype: float64
- name: context_emb_688
dtype: float64
- name: context_emb_689
dtype: float64
- name: context_emb_690
dtype: float64
- name: context_emb_691
dtype: float64
- name: context_emb_692
dtype: float64
- name: context_emb_693
dtype: float64
- name: context_emb_694
dtype: float64
- name: context_emb_695
dtype: float64
- name: context_emb_696
dtype: float64
- name: context_emb_697
dtype: float64
- name: context_emb_698
dtype: float64
- name: context_emb_699
dtype: float64
- name: context_emb_700
dtype: float64
- name: context_emb_701
dtype: float64
- name: context_emb_702
dtype: float64
- name: context_emb_703
dtype: float64
- name: context_emb_704
dtype: float64
- name: context_emb_705
dtype: float64
- name: context_emb_706
dtype: float64
- name: context_emb_707
dtype: float64
- name: context_emb_708
dtype: float64
- name: context_emb_709
dtype: float64
- name: context_emb_710
dtype: float64
- name: context_emb_711
dtype: float64
- name: context_emb_712
dtype: float64
- name: context_emb_713
dtype: float64
- name: context_emb_714
dtype: float64
- name: context_emb_715
dtype: float64
- name: context_emb_716
dtype: float64
- name: context_emb_717
dtype: float64
- name: context_emb_718
dtype: float64
- name: context_emb_719
dtype: float64
- name: context_emb_720
dtype: float64
- name: context_emb_721
dtype: float64
- name: context_emb_722
dtype: float64
- name: context_emb_723
dtype: float64
- name: context_emb_724
dtype: float64
- name: context_emb_725
dtype: float64
- name: context_emb_726
dtype: float64
- name: context_emb_727
dtype: float64
- name: context_emb_728
dtype: float64
- name: context_emb_729
dtype: float64
- name: context_emb_730
dtype: float64
- name: context_emb_731
dtype: float64
- name: context_emb_732
dtype: float64
- name: context_emb_733
dtype: float64
- name: context_emb_734
dtype: float64
- name: context_emb_735
dtype: float64
- name: context_emb_736
dtype: float64
- name: context_emb_737
dtype: float64
- name: context_emb_738
dtype: float64
- name: context_emb_739
dtype: float64
- name: context_emb_740
dtype: float64
- name: context_emb_741
dtype: float64
- name: context_emb_742
dtype: float64
- name: context_emb_743
dtype: float64
- name: context_emb_744
dtype: float64
- name: context_emb_745
dtype: float64
- name: context_emb_746
dtype: float64
- name: context_emb_747
dtype: float64
- name: context_emb_748
dtype: float64
- name: context_emb_749
dtype: float64
- name: context_emb_750
dtype: float64
- name: context_emb_751
dtype: float64
- name: context_emb_752
dtype: float64
- name: context_emb_753
dtype: float64
- name: context_emb_754
dtype: float64
- name: context_emb_755
dtype: float64
- name: context_emb_756
dtype: float64
- name: context_emb_757
dtype: float64
- name: context_emb_758
dtype: float64
- name: context_emb_759
dtype: float64
- name: context_emb_760
dtype: float64
- name: context_emb_761
dtype: float64
- name: context_emb_762
dtype: float64
- name: context_emb_763
dtype: float64
- name: context_emb_764
dtype: float64
- name: context_emb_765
dtype: float64
- name: context_emb_766
dtype: float64
- name: context_emb_767
dtype: float64
- name: bm25_score
dtype: float64
- name: cos_sim_score
dtype: float64
- name: dotp_sim_score
dtype: float64
- name: meta_bm25_score
dtype: float64
- name: meta_cos_sim_score
dtype: float64
- name: meta_dotp_sim_score
dtype: float64
- name: summarized_bm25_score
dtype: float64
- name: summarized_cos_sim_score
dtype: float64
- name: summarized_dotp_sim_score
dtype: float64
- name: label
dtype: float64
splits:
- name: train
num_bytes: 144829280
num_examples: 11710
download_size: 35842316
dataset_size: 144829280
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sayan1101/testing | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 51569
num_examples: 164
- name: test
num_bytes: 12203
num_examples: 41
download_size: 37324
dataset_size: 63772
---
# Dataset Card for "testing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b | ---
pretty_name: Evaluation run of Neko-Institute-of-Science/pygmalion-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Neko-Institute-of-Science/pygmalion-7b](https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:45:41.888775](https://huggingface.co/datasets/open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b/blob/main/results_2023-09-22T20-45-41.888775.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.057855494966443086,\n\
\ \"f1_stderr\": 0.0013312169448543882,\n \"acc\": 0.3842127655245746,\n\
\ \"acc_stderr\": 0.009186954923281733\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268467,\n\
\ \"f1\": 0.057855494966443086,\n \"f1_stderr\": 0.0013312169448543882\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04624715693707354,\n \
\ \"acc_stderr\": 0.0057849916626918655\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871601\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_45_41.888775
path:
- '**/details_harness|drop|3_2023-09-22T20-45-41.888775.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-45-41.888775.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_45_41.888775
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-45-41.888775.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-45-41.888775.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:16:07.141450.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:16:07.141450.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_45_41.888775
path:
- '**/details_harness|winogrande|5_2023-09-22T20-45-41.888775.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-45-41.888775.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_16_07.141450
path:
- results_2023-07-19T16:16:07.141450.parquet
- split: 2023_09_22T20_45_41.888775
path:
- results_2023-09-22T20-45-41.888775.parquet
- split: latest
path:
- results_2023-09-22T20-45-41.888775.parquet
---
# Dataset Card for Evaluation run of Neko-Institute-of-Science/pygmalion-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Neko-Institute-of-Science/pygmalion-7b](https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-22T20:45:41.888775](https://huggingface.co/datasets/open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b/blob/main/results_2023-09-22T20-45-41.888775.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.057855494966443086,
"f1_stderr": 0.0013312169448543882,
"acc": 0.3842127655245746,
"acc_stderr": 0.009186954923281733
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.057855494966443086,
"f1_stderr": 0.0013312169448543882
},
"harness|gsm8k|5": {
"acc": 0.04624715693707354,
"acc_stderr": 0.0057849916626918655
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871601
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kita_ikuyo_bocchitherock | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kita Ikuyo
This is the dataset of Kita Ikuyo, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 650 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 650 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 650 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 650 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
cakiki/arxiv-metadata | ---
license: cc0-1.0
---
|
MARIO-Math-Reasoning/Gaokao2023-Math-En | ---
license: mit
---
# Data Summary
This is a compilation of math test questions and answers drawn from the 2023 Chinese National College Entrance Examination (Gaokao), the 2023 American Mathematics Competitions (AMC), and the 2023 American College Testing (ACT). For simplicity, we refer to it as `Gaokao2023`.
|
CognitiveScience/csdhdata | ---
license: mit
---
|
celsowm/medicamentos_patologia_ner | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence: string
splits:
- name: train
num_bytes: 50246
num_examples: 273
download_size: 10597
dataset_size: 50246
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-Tristan__zero_shot_classification_test-fb99e6e4-4634 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Tristan/zero_shot_classification_test
eval_info:
task: zero_shot_classification
model: facebook/opt-125m
metrics: []
dataset_name: Tristan/zero_shot_classification_test
dataset_config: Tristan--zero_shot_classification_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-125m
* Dataset: Tristan/zero_shot_classification_test
* Config: Tristan--zero_shot_classification_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Tristan](https://huggingface.co/Tristan) for evaluating this model. |
pharaouk/Feedback-Collection | ---
license: cc-by-4.0
task_categories:
- text-generation
- text-classification
language:
- en
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: "new_feedback_collection.json"
---
## Dataset Description
- **Homepage:https://github.com/kaistAI/Prometheus**
- **Repository:https://github.com/kaistAI/Prometheus**
- **Paper:https://arxiv.org/abs/2310.08491**
- **Point of Contact:seungone@kaist.ac.kr**
# Dataset Card
### Dataset Summary
The Feedback Collection is a dataset designed to induce fine-grained evaluation capabilities into language models.\\

Recently, proprietary LLMs (e.g., GPT-4) have been used to evaluate long-form responses. In our experiments, we found that open-source LMs are not capable of evaluating long-form responses, showing low correlation with both human evaluators and GPT-4.\\
In our paper, we found that by (1) fine-tuning on feedback generated by GPT-4 and (2) including the appropriate reference materials (reference answers & score rubrics), we can effectively induce fine-grained evaluation capabilities into open-source LMs.
The Feedback Collection provides 1K score rubrics, 20K instructions & reference answers, 100K responses & feedback (20K for each score in the range 1-5).\\
Experimental results show that Prometheus (an LM obtained by fine-tuning Llama-2-Chat on the Feedback Collection) can function as an evaluator in both an absolute scoring setting and a ranking scoring setting.
### Languages
English
## Dataset Structure
* `instruction`: The input that is given to the evaluator LM. It includes the instruction & response to evaluate, the reference answer, and the score rubric.
* `output`: The output that the evaluator LM should generate. It includes the feedback and the score decision, separated by the phrase `[RESULT]`.
* `orig_instruction`: The instruction to be evaluated. Note that this differs from the `instruction` field, which includes all the components.
* `orig_response`: The response to be evaluated.
* `orig_reference_answer`: A reference answer to the `orig_instruction`.
* `orig_criteria`: The score criteria used to evaluate the `orig_response`.
* `orig_score1_description`: A description of when to give a score of 1 to the `orig_response`.
* `orig_score2_description`: A description of when to give a score of 2 to the `orig_response`.
* `orig_score3_description`: A description of when to give a score of 3 to the `orig_response`.
* `orig_score4_description`: A description of when to give a score of 4 to the `orig_response`.
* `orig_score5_description`: A description of when to give a score of 5 to the `orig_response`.
* `orig_feedback`: A feedback that critiques the `orig_response`.
* `orig_score`: An integer between 1 and 5 given to the `orig_response`.
In our paper, we trained with inputs in the following prompt format (already processed in the `instruction` field):
```
###Task Description:
An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, and a score rubric representing a evaluation criteria are given.
1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general.
2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric.
3. The output format should look as follows: \"Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)\"
4. Please do not generate any other opening, closing, and explanations.
###The instruction to evaluate:
{orig_instruction}
###Response to evaluate:
{orig_response}
###Reference Answer (Score 5):
{orig_reference_answer}
###Score Rubrics:
[{orig_criteria}]
Score 1: {orig_score1_description}
Score 2: {orig_score2_description}
Score 3: {orig_score3_description}
Score 4: {orig_score4_description}
Score 5: {orig_score5_description}
###Feedback:
```
The following prompt format (already processed in the `output` field) was used to train the evaluator LM:
```
{orig_feedback}
[RESULT] {orig_score}
```
Then during evaluation, we parsed the prediction after the phrase ```[RESULT]```.
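As an illustration, this parsing step can be sketched as follows (a minimal sketch; the exact code used for the paper's evaluation may differ):

```python
def parse_prediction(output: str):
    """Split an evaluator LM output of the form
    '{feedback} [RESULT] {score}' into feedback text and integer score."""
    feedback, sep, score_part = output.rpartition("[RESULT]")
    if not sep:  # the [RESULT] phrase is missing
        return output.strip(), None
    try:
        score = int(score_part.strip())
    except ValueError:  # malformed score
        score = None
    return feedback.strip(), score


feedback, score = parse_prediction(
    "The response addresses every rubric point. [RESULT] 4"
)
# feedback == "The response addresses every rubric point.", score == 4
```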
### Data Splits
| name | train |
|-------------------|------:|
|Feedback-Collection|99,952|
### Citation Information
If you find the following model helpful, please consider citing our paper!
```bibtex
@misc{kim2023prometheus,
title={Prometheus: Inducing Fine-grained Evaluation Capability in Language Models},
author={Seungone Kim and Jamin Shin and Yejin Cho and Joel Jang and Shayne Longpre and Hwaran Lee and Sangdoo Yun and Seongjin Shin and Sungdong Kim and James Thorne and Minjoon Seo},
year={2023},
eprint={2310.08491},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
dichmau/ja_vi_translation | ---
task_categories:
- translation
language:
- ja
- vi
---
Japanese-Vietnamese Translated Sentence Pairs. |
TopicNet/RuWiki-Good | ---
language:
- ru
multilinguality:
- monolingual
license: other
license_name: topicnet
license_link: >-
https://github.com/machine-intelligence-laboratory/TopicNet/blob/master/LICENSE.txt
task_categories:
- text-classification
task_ids:
- topic-classification
- multi-class-classification
- multi-label-classification
tags:
- topic-modeling
- topic-modelling
- text-clustering
- multimodal-data
- multimodal-learning
- modalities
- document-representation
---
# RuWiki-Good
Selected articles from the Russian Wikipedia dump (date `20200301`). The dataset consists of good ("добротные", "хорошие") and featured ("избранные") articles (see https://ru.wikipedia.org/wiki/Википедия:Добротные_статьи).
Some measurable characteristics of the dataset:
* D — number of documents
* `<modality name>` W — modality dictionary size (number of unique tokens)
* `<modality name>` len D — average document length in modality tokens (number of tokens)
* `<modality name>` len D uniq — average document length in unique modality tokens (number of unique tokens)
| | D | @categories W | @categories len D | @categories len D uniq | @lemmatized W | @lemmatized len D | @lemmatized len D uniq | @ngramms W | @ngramms len D | @ngramms len D uniq |
|:------|------------:|-----------------------:|---------------------------:|--------------------------------:|-----------------------:|---------------------------:|--------------------------------:|--------------------:|------------------------:|-----------------------------:|
| value | 8603 | 44493 | 5.1718 | 5.17296 | 1.66434e+07 | 1934.6 | 721.389 | 3.11727e+06 | 362.347 | 287.695 |
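As a toy illustration of how these per-modality characteristics are computed (made-up tokens, not the actual preprocessing pipeline):

```python
# Tokens of one modality (e.g. @lemmatized) for two toy documents.
docs = [
    ["мир", "труд", "май", "мир"],
    ["статья", "википедия"],
]

D = len(docs)                                         # number of documents
W = len({tok for doc in docs for tok in doc})         # dictionary size
len_D = sum(len(doc) for doc in docs) / D             # avg length in tokens
len_D_uniq = sum(len(set(doc)) for doc in docs) / D   # avg length in unique tokens
# D == 2, W == 5, len_D == 3.0, len_D_uniq == 2.5
```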
Information about document lengths in modality tokens:
| | len_total@categories | len_total@lemmatized | len_total@ngramms | len_uniq@categories | len_uniq@lemmatized | len_uniq@ngramms |
|:-----|-----------------------:|-----------------------:|--------------------:|----------------------:|----------------------:|-------------------:|
| mean | 5.17296 | 1934.6 | 362.347 | 5.17296 | 721.389 | 287.695 |
| std | 4.0102 | 2065.78 | 384.164 | 4.0102 | 596.947 | 292.717 |
| min | 0 | 55 | 8 | 0 | 35 | 8 |
| 25% | 2 | 593 | 112.5 | 2 | 296 | 96 |
| 50% | 4 | 1223 | 229 | 4 | 540 | 189 |
| 75% | 7 | 2447 | 458 | 7 | 943 | 365 |
| max | 51 | 17472 | 3335 | 51 | 4423 | 2459 |
|
mucai/ViP-Bench | ---
license: apache-2.0
---
<p align="center">
<img src="https://raw.githubusercontent.com/mu-cai/ViP-LLaVA/main/images/vip-llava_arch.png" width="600"> <br>
</p>
# [ViP-Bench: Making Large Multimodal Models Understand Arbitrary Visual Prompts](https://vip-llava.github.io/)
ViP-Bench is a region-level multimodal model evaluation benchmark curated by the University of Wisconsin-Madison. It provides two kinds of visual prompts: (1) bounding boxes, and (2) diverse human-drawn visual prompts.
**Evaluation Code** See [https://github.com/mu-cai/ViP-LLaVA/blob/main/docs/Evaluation.md](https://github.com/mu-cai/ViP-LLaVA/blob/main/docs/Evaluation.md)
**LeaderBoard** See [https://paperswithcode.com/sota/visual-question-answering-on-vip-bench](https://paperswithcode.com/sota/visual-question-answering-on-vip-bench)
**Evaluation Server** Please refer to [https://huggingface.co/spaces/mucai/ViP-Bench_Evaluator](https://huggingface.co/spaces/mucai/ViP-Bench_Evaluator) to use our evaluation server.
## Source annotation
In `source_image`, we provide the source plain images along with the bounding box/mask annotations. Researchers can use this grounding information to match the special tokens such as `<obj>` in the `"question"` entry of `vip-bench-meta-data.json`. For example, `<obj>` can be replaced by textual coordinates to evaluate region-level multimodal models. |
Norod78/simpsons-blip-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 51605730.0
num_examples: 755
download_size: 50553165
dataset_size: 51605730.0
pretty_name: 'Simpsons BLIP captions'
size_categories:
- n<1K
tags: []
task_categories:
- text-to-image
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
---
# Dataset Card for "simpsons-blip-captions"
|
AustinMcMike/steve_jobs_conversational | ---
license: apache-2.0
---
|
Bingsu/st-parallel-sentences | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: en
dtype: string
- name: other
dtype: string
splits:
- name: train
num_bytes: 35774892810
num_examples: 257055413
download_size: 22222052417
dataset_size: 35774892810
---
# Dataset Card for "st-parallel-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
C-MTEB/CLSClusteringP2P | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: sentences
sequence: string
- name: labels
sequence: string
splits:
- name: test
num_bytes: 56780231
num_examples: 10
download_size: 37254736
dataset_size: 56780231
---
# Dataset Card for "CLSClusteringP2P"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shunk031/wrime | ---
annotations_creators:
- crowdsourced
language:
- ja
language_creators:
- crowdsourced
license:
- unknown
multilinguality:
- monolingual
pretty_name: wrime
tags:
- sentiment-analysis
- wrime
task_categories:
- text-classification
task_ids:
- sentiment-classification
datasets:
- ver1
- ver2
metrics:
- accuracy
---
# Dataset Card for WRIME
[](https://github.com/shunk031/huggingface-datasets_wrime/actions/workflows/ci.yaml)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- Homepage: https://github.com/ids-cv/wrime
- Repository: https://github.com/shunk031/huggingface-datasets_wrime
- Paper: https://aclanthology.org/2021.naacl-main.169/
### Dataset Summary
In this study, we introduce a new dataset, WRIME, for emotional intensity estimation. We collect both the subjective emotional intensity of the writers themselves and the objective one annotated by the readers, and explore the differences between them. In our data collection, we hired 50 participants via a crowdsourcing service. They annotated their own past posts on a social networking service (SNS) with the subjective emotional intensity. We also hired 3 annotators, who annotated all posts with the objective emotional intensity. Consequently, our Japanese emotion analysis dataset consists of 17,000 posts with both subjective and objective emotional intensities for Plutchik's eight emotions ([Plutchik, 1980](https://www.sciencedirect.com/science/article/pii/B9780125587013500077)), which are given on a four-point scale (no, weak, medium, and strong).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
- Japanese
## Dataset Structure
### Data Instances
When loading a specific configuration, users have to pass a version-dependent name:
```python
from datasets import load_dataset
dataset = load_dataset("shunk031/wrime", name="ver1")
print(dataset)
# DatasetDict({
# train: Dataset({
# features: ['sentence', 'user_id', 'datetime', 'writer', 'reader1', 'reader2', 'reader3', 'avg_readers'],
# num_rows: 40000
# })
# validation: Dataset({
# features: ['sentence', 'user_id', 'datetime', 'writer', 'reader1', 'reader2', 'reader3', 'avg_readers'],
# num_rows: 1200
# })
# test: Dataset({
# features: ['sentence', 'user_id', 'datetime', 'writer', 'reader1', 'reader2', 'reader3', 'avg_readers'],
# num_rows: 2000
# })
# })
```
#### Ver. 1
An example looks as follows:
```json
{
"sentence": "ぼけっとしてたらこんな時間。チャリあるから食べにでたいのに…",
"user_id": "1",
"datetime": "2012/07/31 23:48",
"writer": {
"joy": 0,
"sadness": 1,
"anticipation": 2,
"surprise": 1,
"anger": 1,
"fear": 0,
"disgust": 0,
"trust": 1
},
"reader1": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 0,
"anger": 0,
"fear": 0,
"disgust": 0,
"trust": 0
},
"reader2": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 1,
"anger": 0,
"fear": 0,
"disgust": 0,
"trust": 0
},
"reader3": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 0,
"anger": 0,
"fear": 1,
"disgust": 1,
"trust": 0
},
"avg_readers": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 0,
"anger": 0,
"fear": 0,
"disgust": 0,
"trust": 0
}
}
```
#### Ver. 2
An example looks as follows:
```json
{
"sentence": "ぼけっとしてたらこんな時間。チャリあるから食べにでたいのに…",
"user_id": "1",
"datetime": "2012/7/31 23:48",
"writer": {
"joy": 0,
"sadness": 1,
"anticipation": 2,
"surprise": 1,
"anger": 1,
"fear": 0,
"disgust": 0,
"trust": 1,
"sentiment": 0
},
"reader1": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 0,
"anger": 0,
"fear": 0,
"disgust": 0,
"trust": 0,
"sentiment": -2
},
"reader2": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 0,
"anger": 0,
"fear": 1,
"disgust": 1,
"trust": 0,
"sentiment": -1
},
"reader3": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 1,
"anger": 0,
"fear": 0,
"disgust": 0,
"trust": 0,
"sentiment": -1
},
"avg_readers": {
"joy": 0,
"sadness": 2,
"anticipation": 0,
"surprise": 0,
"anger": 0,
"fear": 0,
"disgust": 0,
"trust": 0,
"sentiment": -1
}
}
```
### Data Fields
#### Ver. 1
- `sentence`: 投稿テキスト
- `user_id`: ユーザー ID
- `datetime`: 投稿日時
- `writer`: 主観 (書き手)
- `joy`: 主観の喜びの感情
- `sadness`: 主観の悲しみの感情
- `anticipation`: 主観の期待の感情
- `surprise`: 主観の驚きの感情
- `anger`: 主観の怒りの感情
- `fear`: 主観の恐れの感情
- `disgust`: 主観の嫌悪の感情
- `trust`: 主観の信頼の感情
- `reader1`: 客観 A (読み手 A)
- `joy`: 客観 A の喜びの感情
- `sadness`: 客観 A の悲しみの感情
- `anticipation`: 客観 A の期待の感情
- `surprise`: 客観 A の驚きの感情
- `anger`: 客観 A の怒りの感情
- `fear`: 客観 A の恐れの感情
- `disgust`: 客観 A の嫌悪の感情
- `trust`: 客観 A の信頼の感情
- `reader2`: 客観 B (読み手 B)
- `joy`: 客観 B の喜びの感情
- `sadness`: 客観 B の悲しみの感情
- `anticipation`: 客観 B の期待の感情
- `surprise`: 客観 B の驚きの感情
- `anger`: 客観 B の怒りの感情
- `fear`: 客観 B の恐れの感情
- `disgust`: 客観 B の嫌悪の感情
- `trust`: 客観 B の信頼の感情
- `reader3`: 客観 C (読み手 C)
- `joy`: 客観 C の喜びの感情
- `sadness`: 客観 C の悲しみの感情
- `anticipation`: 客観 C の期待の感情
- `surprise`: 客観 C の驚きの感情
- `anger`: 客観 C の怒りの感情
- `fear`: 客観 C の恐れの感情
- `disgust`: 客観 C の嫌悪の感情
- `trust`: 客観 C の信頼の感情
- `avg_readers`
- `joy`: 客観 A, B, C 平均の喜びの感情
- `sadness`: 客観 A, B, C 平均の悲しみの感情
- `anticipation`: 客観 A, B, C 平均の期待の感情
- `surprise`: 客観 A, B, C 平均の驚きの感情
- `anger`: 客観 A, B, C 平均の怒りの感情
- `fear`: 客観 A, B, C 平均の恐れの感情
- `disgust`: 客観 A, B, C 平均の嫌悪の感情
- `trust`: 客観 A, B, C 平均の信頼の感情
#### Ver. 2
- `sentence`: 投稿テキスト
- `user_id`: ユーザー ID
- `datetime`: 投稿日時
- `writer`: 主観 (書き手)
- `joy`: 主観の喜びの感情
- `sadness`: 主観の悲しみの感情
- `anticipation`: 主観の期待の感情
- `surprise`: 主観の驚きの感情
- `anger`: 主観の怒りの感情
- `fear`: 主観の恐れの感情
- `disgust`: 主観の嫌悪の感情
- `trust`: 主観の信頼の感情
- `sentiment`: 主観の感情極性
- `reader1`: 客観 A (読み手 A)
- `joy`: 客観 A の喜びの感情
- `sadness`: 客観 A の悲しみの感情
- `anticipation`: 客観 A の期待の感情
- `surprise`: 客観 A の驚きの感情
- `anger`: 客観 A の怒りの感情
- `fear`: 客観 A の恐れの感情
- `disgust`: 客観 A の嫌悪の感情
- `trust`: 客観 A の信頼の感情
- `sentiment`: 客観 A の感情極性
- `reader2`: 客観 B (読み手 B)
- `joy`: 客観 B の喜びの感情
- `sadness`: 客観 B の悲しみの感情
- `anticipation`: 客観 B の期待の感情
- `surprise`: 客観 B の驚きの感情
- `anger`: 客観 B の怒りの感情
- `fear`: 客観 B の恐れの感情
- `disgust`: 客観 B の嫌悪の感情
- `trust`: 客観 B の信頼の感情
- `sentiment`: 客観 B の感情極性
- `reader3`: 客観 C (読み手 C)
- `joy`: 客観 C の喜びの感情
- `sadness`: 客観 C の悲しみの感情
- `anticipation`: 客観 C の期待の感情
- `surprise`: 客観 C の驚きの感情
- `anger`: 客観 C の怒りの感情
- `fear`: 客観 C の恐れの感情
- `disgust`: 客観 C の嫌悪の感情
- `trust`: 客観 C の信頼の感情
- `sentiment`: 客観 C の感情極性
- `avg_readers`
- `joy`: 客観 A, B, C 平均の喜びの感情
- `sadness`: 客観 A, B, C 平均の悲しみの感情
- `anticipation`: 客観 A, B, C 平均の期待の感情
- `surprise`: 客観 A, B, C 平均の驚きの感情
- `anger`: 客観 A, B, C 平均の怒りの感情
- `fear`: 客観 A, B, C 平均の恐れの感情
- `disgust`: 客観 A, B, C 平均の嫌悪の感情
- `trust`: 客観 A, B, C 平均の信頼の感情
- `sentiment`: 客観 A, B, C 平均の感情極性
### Data Splits
| name | train | validation | test |
|------|-------:|-----------:|------:|
| ver1 | 40,000 | 1,200 | 2,000 |
| ver2 | 30,000 | 2,500 | 2,500 |
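The gap between subjective and objective labels that the paper discusses can be measured per post; a minimal sketch using the Ver. 1 example record above (intensities only, field names as in the card):

```python
EMOTIONS = ["joy", "sadness", "anticipation", "surprise",
            "anger", "fear", "disgust", "trust"]

def writer_reader_gap(example: dict) -> int:
    """Sum of absolute intensity differences between writer and avg_readers."""
    return sum(abs(example["writer"][e] - example["avg_readers"][e])
               for e in EMOTIONS)

example = {
    "writer": {"joy": 0, "sadness": 1, "anticipation": 2, "surprise": 1,
               "anger": 1, "fear": 0, "disgust": 0, "trust": 1},
    "avg_readers": {"joy": 0, "sadness": 2, "anticipation": 0, "surprise": 0,
                    "anger": 0, "fear": 0, "disgust": 0, "trust": 0},
}
print(writer_reader_gap(example))  # 6
```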
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
From [the README](https://github.com/ids-cv/wrime/blob/master/README.en.md#licence) of the GitHub:
- The dataset is available for research purposes only.
- Redistribution of the dataset is prohibited.
### Citation Information
```bibtex
@inproceedings{kajiwara-etal-2021-wrime,
title = "{WRIME}: A New Dataset for Emotional Intensity Estimation with Subjective and Objective Annotations",
author = "Kajiwara, Tomoyuki and
Chu, Chenhui and
Takemura, Noriko and
Nakashima, Yuta and
Nagahara, Hajime",
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.naacl-main.169",
doi = "10.18653/v1/2021.naacl-main.169",
pages = "2095--2104",
abstract = "We annotate 17,000 SNS posts with both the writer{'}s subjective emotional intensity and the reader{'}s objective one to construct a Japanese emotion analysis dataset. In this study, we explore the difference between the emotional intensity of the writer and that of the readers with this dataset. We found that the reader cannot fully detect the emotions of the writer, especially anger and trust. In addition, experimental results in estimating the emotional intensity show that it is more difficult to estimate the writer{'}s subjective labels than the readers{'}. The large gap between the subjective and objective emotions imply the complexity of the mapping from a post to the subjective emotion intensities, which also leads to a lower performance with machine learning models.",
}
```
```bibtex
@inproceedings{suzuki-etal-2022-japanese,
title = "A {J}apanese Dataset for Subjective and Objective Sentiment Polarity Classification in Micro Blog Domain",
author = "Suzuki, Haruya and
Miyauchi, Yuto and
Akiyama, Kazuki and
Kajiwara, Tomoyuki and
Ninomiya, Takashi and
Takemura, Noriko and
Nakashima, Yuta and
Nagahara, Hajime",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.759",
pages = "7022--7028",
abstract = "We annotate 35,000 SNS posts with both the writer{'}s subjective sentiment polarity labels and the reader{'}s objective ones to construct a Japanese sentiment analysis dataset. Our dataset includes intensity labels (\textit{none}, \textit{weak}, \textit{medium}, and \textit{strong}) for each of the eight basic emotions by Plutchik (\textit{joy}, \textit{sadness}, \textit{anticipation}, \textit{surprise}, \textit{anger}, \textit{fear}, \textit{disgust}, and \textit{trust}) as well as sentiment polarity labels (\textit{strong positive}, \textit{positive}, \textit{neutral}, \textit{negative}, and \textit{strong negative}). Previous studies on emotion analysis have studied the analysis of basic emotions and sentiment polarity independently. In other words, there are few corpora that are annotated with both basic emotions and sentiment polarity. Our dataset is the first large-scale corpus to annotate both of these emotion labels, and from both the writer{'}s and reader{'}s perspectives. In this paper, we analyze the relationship between basic emotion intensity and sentiment polarity on our dataset and report the results of benchmarking sentiment polarity classification.",
}
```
### Contributions
Thanks to [@moguranosenshi](https://github.com/moguranosenshi) for creating this dataset.
|
dominguesm/alpaca-data-pt-br | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 18324606
num_examples: 51759
download_size: 11508709
dataset_size: 18324606
license: cc-by-nc-4.0
language:
- pt
tags:
- instruction-finetuning
pretty_name: Alpaca-Cleaned-PTBR
task_categories:
- text-generation
---
**NOTE:** This is a machine translated version of the [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) dataset.
# Dataset Card for Alpaca-Cleaned
- **Repository:** https://github.com/gururise/AlpacaDataCleaned
## Dataset Description
This is a cleaned version of the original Alpaca Dataset released by Stanford. The following issues have been identified in the original release and fixed in this dataset:
1. **Hallucinations:** Many instructions in the original dataset had instructions referencing data on the internet, which just caused GPT3 to hallucinate an answer.
```
"instruction":"Summarize the given article in 200 Words.",
"input": "https://www.bbc.com/news/world-51461830",
"output": "The recent protests in Belarus are part of a growing wave of discontent against the rule of Alexander Lukashenko, who has led the country since 1994. Demonstrations against his government have been ongoing since August, and come in the wake of an election widely seen as rigged. As police and security forces have cracked down on protesters, the movement has only grown, with hundreds of thousands of people taking to the streets to call for new leadership. Human rights activists have accused the government of using extreme violence in order prosecute protesters, leading to an international outcry and sanctions from the European Union. As the situation in Belarus continues to unfold, the international community is monitoring the situation closely - hoping for a peaceful resolution, while also standing firm against the oppressive tactics employed by the Belarusian state."
"instruction": "Classify the following shape according to simple geometry.",
"input": "<Picture Attached>",
"output": "Triangle"
"instruction": "Given a recent news article, write a headline for it.",
"input": "https://www.news.example.com/article-1",
"output": "\"NASA Launches a Mars 2020 Mission to Search for Signs of Ancient Life\""
```
2. **Merged Instructions:** There were many instructions that were merged together in the original dataset for some reason.
```
"instruction": "Should I invest in stocks?",
"input": "",
"output": "It depends on your individual situation. Investing in stocks carries a certain amount of risk but can also offer potential rewards. Make sure to research the different types of stocks, understand the risks and rewards, and develop a plan that works for you.\n##\n18. Instruction: Identify the noun in the following sentence:\n18. Input:\nThe large apple was delicious.\n18. Output:\nApple"
```
3. **Empty outputs:** Some entries in the original dataset had empty outputs.
```
"instruction": "Calculate the area of the following shape in square centimeters.",
"input": "rectangle of size 4 cm x 5 cm",
"output": ""
```
4. **Empty code examples:** Some descriptions in the original dataset were missing code examples, making it difficult to understand the intended behavior of the code.
5. **Instructions to generate images:** Some descriptions in the original dataset included instructions to generate images, something obviously not possible.
```
"instruction": "Create a graphic or logo that visually represents the word \"courage\".",
"input": "",
"output": "<No Output>"
```
6. **N/A outputs:** Some code snippets in the original dataset had N/A outputs.
7. **Inconsistent input field:** The original dataset had inconsistent usage of the input field when it was supposed to be empty.
```
"input":"<no input>"
"input":"No input"
"input":"noinput"
"input":"<noinput>"
```
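A cleanup like this is straightforward to script; a minimal sketch mapping the placeholder variants above to a genuinely empty string:

```python
# The placeholder spellings observed in the original dataset.
EMPTY_INPUT_MARKERS = {"<no input>", "no input", "noinput", "<noinput>"}

def normalize_input(value: str) -> str:
    """Map empty-input placeholder variants to an actual empty string."""
    return "" if value.strip().lower() in EMPTY_INPUT_MARKERS else value

print(normalize_input("<No Input>"))  # ""
print(normalize_input("25m"))         # "25m"
```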
8. **Wrong answers:** Some instructions/questions in the original dataset had incorrect answers. About 80% of the math problems are estimated to have incorrect answers.
```
"instruction": "Calculate the median of the following data set.",
"input": "1, 2, 4, 5, 8, 9",
"output": "5"
"instruction": "Convert 25m to km.",
"input": "",
"output": "25km"
```
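Both corrections are easy to verify with the standard library: the median of an even-length list is the mean of the two middle values (here 4.5, not 5), and 25 m is 0.025 km:

```python
from statistics import median

data = [1, 2, 4, 5, 8, 9]
print(median(data))  # 4.5 -- the mean of the middle values 4 and 5

print(25 / 1000)  # 0.025 (km), not 25 km
```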
9. **Nonsensical/unclear instructions:** Many instructions are unclear; we tried to clarify (or rewrite) instructions that were nonsensical. Instructions that are slightly unclear, but whose meaning can be deduced, were not altered.
```
"instruction": "Freeze the following sample of yogurt for 10 minutes.",
"input": "Yogurt sample",
"output": "<noinput>"
"instruction": "Increase the font size to 12 points.",
"input": "",
"output": "The font size has been increased to 12 points."
```
10. **Extraneous escape and control characters:** The original dataset had several entries with extraneous escape and control characters.
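Stripping such characters while preserving legitimate whitespace can be sketched as follows (one reasonable policy — keep tab, newline, and carriage return, drop other C0 controls and DEL):

```python
import re

# C0 control characters except \t (\x09), \n (\x0a), \r (\x0d), plus DEL.
CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")

def strip_control_chars(text: str) -> str:
    """Remove stray control characters, keeping ordinary whitespace."""
    return CONTROL_CHARS.sub("", text)

print(strip_control_chars("clean\x07 text\x00"))  # "clean text"
```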
### Original Alpaca Dataset Summary
Alpaca is a dataset of 52,000 instructions and demonstrations generated by OpenAI's `text-davinci-003` engine. This instruction data can be used to conduct instruction-tuning for language models and make the language model follow instruction better.
The authors built on the data generation pipeline from [Self-Instruct framework](https://github.com/yizhongw/self-instruct) and made the following modifications:
- The `text-davinci-003` engine to generate the instruction data instead of `davinci`.
- A [new prompt](https://github.com/tatsu-lab/stanford_alpaca/blob/main/prompt.txt) was written that explicitly gave the requirement of instruction generation to `text-davinci-003`.
- Much more aggressive batch decoding was used, i.e., generating 20 instructions at once, which significantly reduced the cost of data generation.
- The data generation pipeline was simplified by discarding the difference between classification and non-classification instructions.
- Only a single instance was generated for each instruction, instead of 2 to 3 instances as in Self-Instruct.
This produced an instruction-following dataset with 52K examples obtained at a much lower cost (less than $500).
In a preliminary study, the authors also found that the 52K generated data to be much more diverse than the data released by [Self-Instruct](https://github.com/yizhongw/self-instruct/blob/main/data/seed_tasks.jsonl).
### Supported Tasks and Leaderboards
The Alpaca dataset is designed for instruction-tuning pretrained language models.
### Languages
The data in Alpaca are in English (BCP-47 en).
## Dataset Structure
### Data Instances
An example of "train" looks as follows:
```json
{
"instruction": "Create a classification task by clustering the given list of items.",
"input": "Apples, oranges, bananas, strawberries, pineapples",
"output": "Class 1: Apples, Oranges\nClass 2: Bananas, Strawberries\nClass 3: Pineapples",
"text": "Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nCreate a classification task by clustering the given list of items.\n\n### Input:\nApples, oranges, bananas, strawberries, pineapples\n\n### Response:\nClass 1: Apples, Oranges\nClass 2: Bananas, Strawberries\nClass 3: Pineapples",
}
```
### Data Fields
The data fields are as follows:
* `instruction`: describes the task the model should perform. Each of the 52K instructions is unique.
* `input`: optional context or input for the task. For example, when the instruction is "Summarize the following article", the input is the article. Around 40% of the examples have an input.
* `output`: the answer to the instruction as generated by `text-davinci-003`.
* `text`: the `instruction`, `input` and `output` formatted with the [prompt template](https://github.com/tatsu-lab/stanford_alpaca#data-release) used by the authors for fine-tuning their models.
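The assembly of `text` from the other three fields can be sketched like this, using the with-input template visible in the data instance above (the no-input variant of the official template uses a shorter preamble, not shown here):

```python
PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

def build_text(example: dict) -> str:
    """Render one record into the fine-tuning prompt format."""
    return PROMPT_WITH_INPUT.format(**example)

example = {
    "instruction": "Convert 25m to km.",
    "input": "25m",
    "output": "0.025km",
}
print(build_text(example))
```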
### Data Splits
| | train |
|---------------|------:|
| alpaca | 52002 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
Excerpt the [blog post](https://crfm.stanford.edu/2023/03/13/alpaca.html) accompanying the release of this dataset:
> We believe that releasing the above assets will enable the academic community to perform controlled scientific studies on instruction-following language models, resulting in better science and ultimately new techniques to address the existing deficiencies with these models. At the same time, any release carries some risk. First, we recognize that releasing our training recipe reveals the feasibility of certain capabilities. On one hand, this enables more people (including bad actors) to create models that could cause harm (either intentionally or not). On the other hand, this awareness might incentivize swift defensive action, especially from the academic community, now empowered by the means to perform deeper safety research on such models. Overall, we believe that the benefits for the research community outweigh the risks of this particular release. Given that we are releasing the training recipe, we believe that releasing the data, model weights, and training code incur minimal further risk, given the simplicity of the recipe. At the same time, releasing these assets has enormous benefits for reproducible science, so that the academic community can use standard datasets, models, and code to perform controlled comparisons and to explore extensions. Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud, or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI’s content moderation API, which filters out harmful content as defined by OpenAI’s usage policies. Second, we watermark all the model outputs using the method described in Kirchenbauer et al. 2023, so that others can detect (with some probability) whether an output comes from Alpaca 7B. 
Finally, we have strict terms and conditions for using the demo; it is restricted to non-commercial uses and to uses that follow LLaMA’s license agreement. We understand that these mitigation measures can be circumvented once we release the model weights or if users train their own instruction-following models. However, by installing these mitigations, we hope to advance the best practices and ultimately develop community norms for the responsible deployment of foundation models.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
The `alpaca` data is generated by a language model (`text-davinci-003`) and inevitably contains some errors or biases. We encourage users to use this data with caution and propose new methods to filter or improve the imperfections.
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is available under the [Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/).
### Citation Information
```
@misc{alpaca,
author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto },
title = {Stanford Alpaca: An Instruction-following LLaMA model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
```
### Contributions
[More Information Needed] |
pythainlp/thainer-corpus-v2.2 | ---
language:
- th
license: cc-by-3.0
task_categories:
- token-classification
dataset_info:
features:
- name: words
sequence: string
- name: ner
sequence:
class_label:
names:
'0': B-PERSON
'1': I-PERSON
'2': O
'3': B-ORGANIZATION
'4': B-LOCATION
'5': I-ORGANIZATION
'6': I-LOCATION
'7': B-DATE
'8': I-DATE
'9': B-TIME
'10': I-TIME
'11': B-MONEY
'12': I-MONEY
'13': B-FACILITY
'14': I-FACILITY
'15': B-URL
'16': I-URL
'17': B-PERCENT
'18': I-PERCENT
'19': B-LEN
'20': I-LEN
'21': B-AGO
'22': I-AGO
'23': B-LAW
'24': I-LAW
'25': B-PHONE
'26': I-PHONE
'27': B-EMAIL
'28': I-EMAIL
'29': B-ZIP
'30': B-TEMPERATURE
'31': I-TEMPERATURE
splits:
- name: train
num_bytes: 3739947
num_examples: 4379
- name: validation
num_bytes: 1215876
num_examples: 1475
- name: test
num_bytes: 1243881
num_examples: 1472
download_size: 999069
dataset_size: 6199704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Thai NER v2.2
Thai Named Entity Recognition Corpus
**You can download .conll to train named entity model in [https://zenodo.org/records/10795907](https://zenodo.org/records/10795907).**
**Size**
- Train: 3,938 docs
- Validation: 1,313 docs
- Test: 1,313 docs
Some data come from crowdsourcing between Dec 2018 and Nov 2019. [https://github.com/wannaphong/thai-ner](https://github.com/wannaphong/thai-ner)
**Domain**
- News (It, politics, economy, social)
- PR (KKU news)
- general
**Source**
- I use some data from Nutcha’s thesis (http://pioneer.chula.ac.th/~awirote/Data-Nutcha.zip) and improved the data by rechecking and adding more tagging.
- Twitter
- Blognone.com - It news
- thaigov.go.th
- kku.ac.th
And more (the full list has been lost).
**Tag**
- DATE - date
- TIME - time
- EMAIL - email
- LEN - length
- LOCATION - Location
- ORGANIZATION - Company / Organization
- PERSON - Person name
- PHONE - phone number
- TEMPERATURE - temperature
- URL - URL
- ZIP - Zip code
- MONEY - the amount
- LAW - legislation
- PERCENT - PERCENT
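A common post-processing step for a BIO-tagged corpus like this one is grouping tags back into entity spans. A minimal, dependency-free sketch (Thai is written without spaces, so tokens are joined directly):

```python
def bio_to_spans(words, tags):
    """Group parallel BIO tag/word sequences into (entity_type, text) spans."""
    spans, current, ctype = [], [], None
    for word, tag in zip(words, tags):
        if tag.startswith("B-"):
            if current:
                spans.append((ctype, "".join(current)))
            current, ctype = [word], tag[2:]
        elif tag.startswith("I-") and ctype == tag[2:]:
            current.append(word)
        else:  # "O" or an inconsistent I- tag closes the span
            if current:
                spans.append((ctype, "".join(current)))
            current, ctype = [], None
    if current:
        spans.append((ctype, "".join(current)))
    return spans

# Made-up example sentence for illustration:
words = ["นาย", "สมชาย", "ไป", "ขอนแก่น"]
tags = ["B-PERSON", "I-PERSON", "O", "B-LOCATION"]
print(bio_to_spans(words, tags))
# [('PERSON', 'นายสมชาย'), ('LOCATION', 'ขอนแก่น')]
```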
## Cite
> Wannaphong Phatthiyaphaibun. (2024). Thai NER 2.2 (2.2) [Data set]. Zenodo. https://doi.org/10.5281/zenodo.10795907
or BibTeX
```
@dataset{wannaphong_phatthiyaphaibun_2024_10795907,
author = {Wannaphong Phatthiyaphaibun},
title = {Thai NER 2.2},
month = mar,
year = 2024,
publisher = {Zenodo},
version = {2.2},
doi = {10.5281/zenodo.10795907},
url = {https://doi.org/10.5281/zenodo.10795907}
}
``` |
Tristan/wikipedia-august-october-line-diff-1000-char-threshold-1000-sample | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
- name: crawl_timestamp
dtype: int64
- name: reward
dtype: int64
splits:
- name: train
num_bytes: 1431232
num_examples: 1000
download_size: 567286
dataset_size: 1431232
---
# Dataset Card for "wikipedia-august-october-line-diff-1000-char-threshold-1000-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangshuoming/switch_subset_exebench | ---
dataset_info:
features:
- name: train_real_simple_io
struct:
- name: asm
struct:
- name: code
sequence: string
- name: target
sequence: string
- name: fname
dtype: string
- name: func_def
dtype: string
- name: func_head
dtype: string
- name: func_head_types
dtype: string
- name: path
dtype: string
- name: real_deps
dtype: string
- name: real_exe_wrapper
dtype: string
- name: real_io_pairs
struct:
- name: dummy_funcs
sequence: 'null'
- name: dummy_funcs_seed
sequence: 'null'
- name: input
list:
- name: value
sequence: string
- name: var
sequence: string
- name: output
list:
- name: value
sequence: string
- name: var
sequence: string
- name: real_iospec
dtype: string
- name: ref
dtype: string
- name: signature
sequence: string
- name: synth_deps
dtype: string
- name: synth_exe_wrapper
dtype: string
- name: synth_io_pairs
struct:
- name: dummy_funcs
sequence: string
- name: dummy_funcs_seed
sequence: int64
- name: input
list:
- name: value
sequence: string
- name: var
sequence: string
- name: output
list:
- name: value
sequence: string
- name: var
sequence: string
- name: synth_iospec
dtype: string
splits:
- name: train
num_bytes: 17599214
num_examples: 1107
download_size: 3846323
dataset_size: 17599214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ajankelo/pklot_50 | ---
language: en
license: cc-by-4.0
tags:
- PKLot
- object detection
---
# PKLot 50
This dataset comprises 50 fully annotated images. The original images were introduced in [*PKLot – A robust dataset for parking lot classification*](https://www.inf.ufpr.br/lesoliveira/download/ESWA2015.pdf).
## Labeling Method
Labeling was manually completed using CVAT with the assistance of Voxel51 for inspection.
## Original dataset citation info
Almeida, P., Oliveira, L. S., Silva Jr, E., Britto Jr, A., Koerich, A., PKLot – A robust dataset for parking lot classification, Expert Systems with Applications, 42(11):4937-4949, 2015.
|
open-llm-leaderboard/details_Test157t__Kunocchini-1.2-7b-longtext | ---
pretty_name: Evaluation run of Test157t/Kunocchini-1.2-7b-longtext
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Kunocchini-1.2-7b-longtext](https://huggingface.co/Test157t/Kunocchini-1.2-7b-longtext)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Kunocchini-1.2-7b-longtext\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T19:52:16.065136](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-1.2-7b-longtext/blob/main/results_2024-02-29T19-52-16.065136.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6267864351550836,\n\
\ \"acc_stderr\": 0.0325192896182257,\n \"acc_norm\": 0.6332530550898832,\n\
\ \"acc_norm_stderr\": 0.0331825601519831,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.41719922898961076,\n\
\ \"mc2_stderr\": 0.014076694537315668\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.014497573881108283,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6313483369846644,\n\
\ \"acc_stderr\": 0.004814532642574651,\n \"acc_norm\": 0.8251344353714399,\n\
\ \"acc_norm_stderr\": 0.0037907576465758975\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"\
acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955293,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955293\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n\
\ \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n\
\ \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n\
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371537,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371537\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024647,\n \"mc2\": 0.41719922898961076,\n\
\ \"mc2_stderr\": 0.014076694537315668\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698334\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3290371493555724,\n \
\ \"acc_stderr\": 0.012942375603679364\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Kunocchini-1.2-7b-longtext
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|arc:challenge|25_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|gsm8k|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hellaswag|10_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-52-16.065136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T19-52-16.065136.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- '**/details_harness|winogrande|5_2024-02-29T19-52-16.065136.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T19-52-16.065136.parquet'
- config_name: results
data_files:
- split: 2024_02_29T19_52_16.065136
path:
- results_2024-02-29T19-52-16.065136.parquet
- split: latest
path:
- results_2024-02-29T19-52-16.065136.parquet
---
# Dataset Card for Evaluation run of Test157t/Kunocchini-1.2-7b-longtext
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Kunocchini-1.2-7b-longtext](https://huggingface.co/Test157t/Kunocchini-1.2-7b-longtext) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Kunocchini-1.2-7b-longtext",
"harness_winogrande_5",
	split="latest")
```
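Each timestamped split name is the run timestamp with the characters that are not allowed in split names replaced by underscores. A small helper (hypothetical, not part of the `datasets` API) can convert a split name back to a `datetime`, which is handy for sorting runs when a repo contains several:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names look like "2024_02_29T19_52_16.065136": the date part uses
    # underscores where an ISO timestamp uses dashes, and the time part uses
    # underscores where it uses colons.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2024_02_29T19_52_16.065136"))
# → 2024-02-29 19:52:16.065136
```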
## Latest results
These are the [latest results from run 2024-02-29T19:52:16.065136](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Kunocchini-1.2-7b-longtext/blob/main/results_2024-02-29T19-52-16.065136.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task has its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.6267864351550836,
"acc_stderr": 0.0325192896182257,
"acc_norm": 0.6332530550898832,
"acc_norm_stderr": 0.0331825601519831,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.41719922898961076,
"mc2_stderr": 0.014076694537315668
},
"harness|arc:challenge|25": {
"acc": 0.5622866894197952,
"acc_stderr": 0.014497573881108283,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.6313483369846644,
"acc_stderr": 0.004814532642574651,
"acc_norm": 0.8251344353714399,
"acc_norm_stderr": 0.0037907576465758975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955293,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955293
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899126,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899126
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371537,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024647,
"mc2": 0.41719922898961076,
"mc2_stderr": 0.014076694537315668
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698334
},
"harness|gsm8k|5": {
"acc": 0.3290371493555724,
"acc_stderr": 0.012942375603679364
}
}
```
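The per-task entries above can be aggregated directly from the parsed JSON. The sketch below averages the `acc` field across every harness task entry, skipping the `"all"` aggregate and entries without an `acc` key (such as `truthfulqa:mc`); note this is a plain mean over tasks, and the leaderboard's own aggregation may weight tasks differently. The `sample` dict is a stand-in for the full results payload:

```python
def mean_task_acc(results: dict) -> float:
    # Average "acc" across per-task entries, skipping the "all" aggregate
    # and entries (e.g. truthfulqa:mc) that report no "acc" field.
    accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

sample = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.4},
    "harness|hellaswag|10": {"acc": 0.6},
}
print(mean_task_acc(sample))  # → 0.5
```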
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuyuc/chem-uspto | ---
license: openrail
---
|
dhivyamadhavan/task2_dataset | ---
dataset_info:
features:
- name: messages
dtype: string
splits:
- name: train_ift
num_bytes: 16141
num_examples: 45
download_size: 9714
dataset_size: 16141
configs:
- config_name: default
data_files:
- split: train_ift
path: data/train_ift-*
---
|
kkyo/Nagisin | ---
license: openrail
---
|
danigr7/prueba | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': crazing
'1': inclusion
'2': patches
'3': pitted_surface
'4': rolled-in_scale
'5': scratches
splits:
- name: train
num_bytes: 22821674.88
num_examples: 1440
- name: validation
num_bytes: 5395918
num_examples: 360
download_size: 26736558
dataset_size: 28217592.88
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
task_categories:
- image-classification
language:
- en
--- |
goodfellowliu/Urban100 | ---
license: apache-2.0
---
|
hotchpotch/jaqket_cc | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: competition
dtype: string
- name: timestamp
dtype: string
- name: section
dtype: string
- name: original_question
dtype: string
- name: original_answer
dtype: string
- name: original_additional_info
dtype: string
- name: number
dtype: string
splits:
- name: aio_01_test
num_bytes: 900806
num_examples: 2000
- name: aio_01_dev
num_bytes: 868722
num_examples: 1992
- name: aio_01_unused
num_bytes: 267601
num_examples: 608
download_size: 1090027
dataset_size: 2037129
configs:
- config_name: default
data_files:
- split: aio_01_test
path: v2/aio_01_test-*
- split: aio_01_dev
path: v2/aio_01_dev-*
- split: aio_01_unused
path: v2/aio_01_unused-*
task_categories:
- question-answering
language:
- ja
---
# A dataset containing only the CC-BY-SA data from JAQKET
Of the quiz data distributed in the [official AI王 dataset (JAQKET)](https://sites.google.com/view/project-aio/dataset?authuser=0), this dataset includes only the data licensed under [CC-BY-SA-4.0](https://creativecommons.org/licenses/by-sa/4.0/deed.ja).
## Acknowledgements
Many thanks to everyone involved in organizing AI王 and creating the quizzes for releasing such a useful dataset.
## Usage
```python
from datasets import load_dataset
ds = load_dataset("hotchpotch/jaqket_cc")
ds["aio_01_dev"][0]
#=> {'qid': 'QA20CAPR-0002',
'question': '明治時代に西洋から伝わった「テーブル・ターニング」に起源を持つ占いの一種で、50音表などを記入した紙を置き、参加者全員の人差し指をコインに置いて行うのは何でしょう?',
'answers': ['コックリさん'],
'competition': '第1回AI王',
'timestamp': '2019/12/25',
'section': '開発データ問題 (dev1)',
'original_question': '明治時代に西洋から伝わった「テーブル・ターニング」に起源を持つ占いの一種で、50音表などを記入した紙を置き、参加者全員の人差し指をコインに置いて行うのは何でしょう?',
'original_answer': 'コックリさん',
'original_additional_info': '',
'number': '2'}
```
|
mask-distilled-one-sec-cv12/chunk_139 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1130164308
num_examples: 221949
download_size: 1151665269
dataset_size: 1130164308
---
# Dataset Card for "chunk_139"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
scfengv/TVL_Sentiment_Analysis | ---
task_categories:
- text-classification
language:
- zh
--- |
ineoApp/factures-99 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': numero facture
'2': Telephone
'3': Email
'4': Site web
'5': RC
'6': CNSS
'7': TP
'8': Mode de paiement
'9': fournisseur
'10': date facture
'11': date limite
'12': montant ht
'13': montant ttc
'14': tva
'15': prix tva
'16': addresse
'17': reference
'18': Devise
'19': ICE fournisseur
'20': IF fournisseur
'21': Condition de paiement
'22': informations
'23': art1 designation
'24': art1 quantite
'25': art1 prix unit
'26': art1 tva
'27': art1 montant ht
'28': art1 Article
'29': art1 taux de remise
'30': art2 designation
'31': art2 quantite
'32': art2 prix unit
'33': art2 tva
'34': art2 montant ht
'35': art2 Article
'36': art2 taux de remise
'37': art3 designation
'38': art3 quantite
'39': art3 prix unit
'40': art3 tva
'41': art3 montant ht
'42': art3 Article
'43': art3 taux de remise
'44': art4 designation
'45': art4 quantite
'46': art4 prix unit
'47': art4 tva
'48': art4 montant ht
'49': art4 Article
'50': art4 taux de remise
'51': art5 designation
'52': art5 quantite
'53': art5 prix unit
'54': art5 tva
'55': art5 montant ht
'56': art5 Article
'57': art5 taux de remise
'58': art6 designation
'59': art6 quantite
'60': art6 prix unit
'61': art6 tva
'62': art6 montant ht
'63': art6 Article
'64': art6 taux de remise
'65': art7 designation
'66': art7 quantite
'67': art7 prix unit
'68': art7 tva
'69': art7 montant ht
'70': art7 Article
'71': art7 taux de remise
'72': art8 designation
'73': art8 quantite
'74': art8 prix unit
'75': art8 tva
'76': art8 montant ht
'77': art8 Article
'78': art8 taux de remise
'79': art9 designation
'80': art9 quantite
'81': art9 prix unit
'82': art9 tva
'83': art9 montant ht
'84': art9 Article
'85': art9 taux de remise
'86': art10 designation
'87': art10 quantite
'88': art10 prix unit
'89': art10 tva
'90': art10 montant ht
'91': art10 Article
'92': art10 taux de remise
'93': art11 designation
'94': art11 quantite
'95': art11 prix unit
'96': art11 tva
'97': art11 montant ht
'98': art11 Article
'99': art11 taux de remise
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 83534936.8
num_examples: 60
- name: test
num_bytes: 20883734.2
num_examples: 15
download_size: 96896920
dataset_size: 104418671.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_rte_bare_past_tense | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 695663
num_examples: 1862
- name: train
num_bytes: 604595
num_examples: 1545
download_size: 844197
dataset_size: 1300258
---
# Dataset Card for "MULTI_VALUE_rte_bare_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/chemistry_dataset_standardized_cluster_2_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4782401
num_examples: 6678
download_size: 1923152
dataset_size: 4782401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_standardized_cluster_2_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hate_speech18 | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- intent-classification
paperswithcode_id: hate-speech
pretty_name: Hate Speech
dataset_info:
features:
- name: text
dtype: string
- name: user_id
dtype: int64
- name: subforum_id
dtype: int64
- name: num_contexts
dtype: int64
- name: label
dtype:
class_label:
names:
'0': noHate
'1': hate
'2': idk/skip
'3': relation
splits:
- name: train
num_bytes: 1375340
num_examples: 10944
download_size: 3664530
dataset_size: 1375340
train-eval-index:
- config: default
task: text-classification
task_id: multi_class_classification
splits:
train_split: train
col_mapping:
text: text
label: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for Hate Speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/Vicomtech/hate-speech-dataset
- **Repository:** https://github.com/Vicomtech/hate-speech-dataset
- **Paper:** https://www.aclweb.org/anthology/W18-51.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
These files contain text extracted from Stormfront, a white supremacist forum. A random set of forum posts was sampled from
several subforums and split into sentences. Those sentences were manually labelled as containing hate speech or not, according
to a set of annotation guidelines.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- text: the provided sentence
- user_id: an identifier that makes it possible to re-build the conversations these sentences belong to
- subforum_id: an identifier that makes it possible to re-build the conversations these sentences belong to
- num_contexts: the number of previous posts the annotator had to read before deciding on the category of the sentence
- label: hate, noHate, relation (the sentence doesn't contain hate speech on its own, but a combination of several sentences in the post does),
or idk/skip (the sentence is not written in English or doesn't contain enough information to be classified as hate or noHate)
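The integer ids stored in the `label` column map to the class names declared in this card's `dataset_info`. A minimal sketch of converting between ids and names (plain Python, no external dependencies assumed):

```python
# Label schema as declared in this card's dataset_info (class_label names)
label_names = ["noHate", "hate", "idk/skip", "relation"]

# Forward and reverse lookups between integer ids and label names
id2label = dict(enumerate(label_names))
label2id = {name: i for i, name in enumerate(label_names)}

print(id2label[0])       # noHate
print(label2id["hate"])  # 1
```

The same mapping is exposed by the `datasets` library via the feature's `ClassLabel` object once the dataset is loaded.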
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{gibert2018hate,
title = "{Hate Speech Dataset from a White Supremacy Forum}",
author = "de Gibert, Ona and
Perez, Naiara and
Garc{\'\i}a-Pablos, Aitor and
Cuadros, Montse",
booktitle = "Proceedings of the 2nd Workshop on Abusive Language Online ({ALW}2)",
month = oct,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/W18-5102",
doi = "10.18653/v1/W18-5102",
pages = "11--20",
}
```
### Contributions
Thanks to [@czabo](https://github.com/czabo) for adding this dataset. |
mmuttharasan/llmjptk4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 109096.0
num_examples: 26
- name: test
num_bytes: 109096.0
num_examples: 26
download_size: 48848
dataset_size: 218192.0
---
# Dataset Card for "llmjptk4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-217K | ---
pretty_name: Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-217K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lodrick-the-lafted/Hermes-Instruct-7B-217K](https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-217K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-217K\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T03:50:00.047589](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-217K/blob/main/results_2024-02-21T03-50-00.047589.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6119336789446922,\n\
\ \"acc_stderr\": 0.03277328607172842,\n \"acc_norm\": 0.6158709652995045,\n\
\ \"acc_norm_stderr\": 0.03343288617558588,\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.017412941986115312,\n \"mc2\": 0.6181304894795385,\n\
\ \"mc2_stderr\": 0.01547067307242061\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137986,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892898\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6364270065723959,\n\
\ \"acc_stderr\": 0.004800446397653348,\n \"acc_norm\": 0.8264289982075284,\n\
\ \"acc_norm_stderr\": 0.0037796612246514746\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671753,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.02514180151117749,\n \
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.02514180151117749\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945266,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945266\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.01690927688493607,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.01690927688493607\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303531,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303531\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179337,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179337\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.01480538447837116,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.01480538447837116\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3564245810055866,\n\
\ \"acc_stderr\": 0.01601823971051341,\n \"acc_norm\": 0.3564245810055866,\n\
\ \"acc_norm_stderr\": 0.01601823971051341\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.017412941986115312,\n \"mc2\": 0.6181304894795385,\n\
\ \"mc2_stderr\": 0.01547067307242061\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205208\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4450341167551175,\n \
\ \"acc_stderr\": 0.013689011567414198\n }\n}\n```"
repo_url: https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-217K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|arc:challenge|25_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|gsm8k|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hellaswag|10_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-50-00.047589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T03-50-00.047589.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- '**/details_harness|winogrande|5_2024-02-21T03-50-00.047589.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T03-50-00.047589.parquet'
- config_name: results
data_files:
- split: 2024_02_21T03_50_00.047589
path:
- results_2024-02-21T03-50-00.047589.parquet
- split: latest
path:
- results_2024-02-21T03-50-00.047589.parquet
---
# Dataset Card for Evaluation run of lodrick-the-lafted/Hermes-Instruct-7B-217K
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lodrick-the-lafted/Hermes-Instruct-7B-217K](https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-217K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-217K",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-21T03:50:00.047589](https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-217K/blob/main/results_2024-02-21T03-50-00.047589.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6119336789446922,
"acc_stderr": 0.03277328607172842,
"acc_norm": 0.6158709652995045,
"acc_norm_stderr": 0.03343288617558588,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115312,
"mc2": 0.6181304894795385,
"mc2_stderr": 0.01547067307242061
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137986,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892898
},
"harness|hellaswag|10": {
"acc": 0.6364270065723959,
"acc_stderr": 0.004800446397653348,
"acc_norm": 0.8264289982075284,
"acc_norm_stderr": 0.0037796612246514746
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671753,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630643,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630643
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945266,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945266
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.01690927688493607,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.01690927688493607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179337,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179337
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.01480538447837116,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.01480538447837116
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608408,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3564245810055866,
"acc_stderr": 0.01601823971051341,
"acc_norm": 0.3564245810055866,
"acc_norm_stderr": 0.01601823971051341
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115312,
"mc2": 0.6181304894795385,
"mc2_stderr": 0.01547067307242061
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205208
},
"harness|gsm8k|5": {
"acc": 0.4450341167551175,
"acc_stderr": 0.013689011567414198
}
}
```
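The per-task metrics above follow a simple `harness|<task>|<n_shot>` keying scheme, each entry carrying `acc`/`acc_norm` values. A minimal sketch of aggregating such a results dictionary (the helper `mean_acc` is hypothetical; the excerpt below copies a few values from the block above for illustration):

```python
import json

# A small excerpt mirroring the structure of the results block above.
results_json = """
{
  "harness|hendrycksTest-management|5": {"acc": 0.7475728155339806, "acc_norm": 0.7475728155339806},
  "harness|hendrycksTest-marketing|5": {"acc": 0.8846153846153846, "acc_norm": 0.8846153846153846},
  "harness|gsm8k|5": {"acc": 0.4450341167551175}
}
"""

def mean_acc(results: dict, prefix: str = "harness|hendrycksTest-") -> float:
    """Average the `acc` metric over all tasks whose key starts with `prefix`."""
    scores = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(scores) / len(scores)

results = json.loads(results_json)
# Mean accuracy over the two sampled hendrycksTest tasks; gsm8k is filtered out by the prefix.
print(round(mean_acc(results), 4))
```

The same pattern extends to `acc_norm` or to other prefixes (e.g. `harness|gsm8k`) by changing the metric key or `prefix` argument.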
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
camilo03soares/camilojb | ---
license: openrail
---
|
Rhitrao/YL1 | ---
license: other
---
|
weijie210/ultrafeedback_critique_score_first | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 233540577.90884757
num_examples: 59979
- name: test
num_bytes: 12288535.385390269
num_examples: 3156
download_size: 94736582
dataset_size: 245829113.29423785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_conditional_were_was | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 189
num_examples: 1
- name: train
num_bytes: 1113
num_examples: 6
download_size: 0
dataset_size: 1302
---
# Dataset Card for "MULTI_VALUE_stsb_conditional_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_EleutherAI__gpt-neox-20b | ---
pretty_name: Evaluation run of EleutherAI/gpt-neox-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neox-20b\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T17:14:42.607420](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neox-20b/blob/main/results_2023-12-03T17-14-42.607420.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.05458680818802123,\n\
\ \"acc_stderr\": 0.00625744403791253\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.05458680818802123,\n \"acc_stderr\": 0.00625744403791253\n\
\ }\n}\n```"
repo_url: https://huggingface.co/EleutherAI/gpt-neox-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_08T18_29_20.429481
path:
- '**/details_harness|drop|3_2023-09-08T18-29-20.429481.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-08T18-29-20.429481.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_08T18_29_20.429481
path:
- '**/details_harness|gsm8k|5_2023-09-08T18-29-20.429481.parquet'
- split: 2023_12_03T17_14_42.607420
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-14-42.607420.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-14-42.607420.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:44:54.391639.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:44:54.391639.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_08T18_29_20.429481
path:
- '**/details_harness|winogrande|5_2023-09-08T18-29-20.429481.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-08T18-29-20.429481.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_20_23.118147
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet'
- config_name: results
data_files:
- split: 2023_07_20T10_44_54.391639
path:
- results_2023-07-20T10:44:54.391639.parquet
- split: 2023_08_28T20_20_23.118147
path:
- results_2023-08-28T20:20:23.118147.parquet
- split: 2023_09_08T18_29_20.429481
path:
- results_2023-09-08T18-29-20.429481.parquet
- split: 2023_12_03T17_14_42.607420
path:
- results_2023-12-03T17-14-42.607420.parquet
- split: latest
path:
- results_2023-12-03T17-14-42.607420.parquet
---
# Dataset Card for Evaluation run of EleutherAI/gpt-neox-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neox-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neox-20b",
"harness_gsm8k_5",
split="train")
```
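The timestamped split names in the YAML above are mechanically derived from the run timestamps: hyphens and colons become underscores while the fractional-seconds dot is kept. A small sketch of the convention (the helper name is just for illustration):

```python
def timestamp_to_split(ts: str) -> str:
    # e.g. "2023-12-03T17:14:42.607420" -> "2023_12_03T17_14_42.607420"
    return ts.replace("-", "_").replace(":", "_")

timestamp_to_split("2023-12-03T17:14:42.607420")  # -> "2023_12_03T17_14_42.607420"
```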
## Latest results
These are the [latest results from run 2023-12-03T17:14:42.607420](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neox-20b/blob/main/results_2023-12-03T17-14-42.607420.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.05458680818802123,
"acc_stderr": 0.00625744403791253
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.00625744403791253
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TigerResearch/tigerbot-firefly-zh-20k | ---
license: apache-2.0
---
[Tigerbot](https://github.com/TigerResearch/TigerBot) question-answering SFT data generated from the firefly dataset.
This dataset is shared under the Apache-2.0 license; if the source data carries a stricter license, that license is inherited.
## Usage
```python
import datasets
ds_sft = datasets.load_dataset('TigerResearch/tigerbot-firefly-zh-20k')
```
|
atmallen/quirky_popqa_pythia-410m_alice | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: popularity
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: bob_log_odds
dtype: float64
splits:
- name: train
num_bytes: 3826956.0
num_examples: 24534
- name: validation
num_bytes: 312334.0
num_examples: 2000
- name: test
num_bytes: 311986.0
num_examples: 2000
download_size: 1551397
dataset_size: 4451276.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
DarwinAnim8or/greentext | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- machine-generated
license:
- unknown
multilinguality:
- monolingual
pretty_name: 'Greentext Dataset
This is content pulled from various archives to create a "greentext bot" of sorts using
GPT-JT-8Bit. '
size_categories: []
source_datasets: []
tags:
- grug
- internet
- greentext
task_categories:
- text2text-generation
task_ids: []
---
# Greentext Dataset
This is content pulled from various archives to create a "greentext bot" of sorts using GPT-JT.
Really, just a dumb joke I made with some friends.
## Biases & Limitations
This dataset contains escaped characters such as literal `\n` and `\u2019` sequences that need to be filtered out manually.
Needless to say, this dataset contains *many* instances of profanity & biases, as it is trained on data from hell.
I don't recommend actually using any of this. |
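Should you want to clean it anyway, here is a minimal sketch, assuming the escapes appear as literal text in the raw strings (the function name is hypothetical):

```python
import codecs

def clean_greentext(raw: str) -> str:
    # Decode literal escape sequences such as "\n" and "\u2019"
    # that appear as plain text in the dataset.
    return codecs.decode(raw, "unicode_escape")

clean_greentext(r"be me\u2019d rather not\ngreentext")  # newlines and quotes restored
```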
bs-code/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pinecone/refinedweb-generated-questions | ---
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
# Generated Questions and Answers from the Falcon RefinedWeb Dataset
This dataset contains 1k open-domain questions and answers generated using documents from Falcon's [refinedweb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) dataset using GPT-4. You can find more details about this work in the following [blogpost](https://www.pinecone.io/blog/rag-study/).
Each row consists of:
- **document_id** - an id of a text chunk from the refined web dataset, from which the question was generated. Each id contains the original document index from the refinedweb dataset, and the chunk index in the following format: "${REFINEDWEB_ID}_${CHUNK_INDEX}"
- **document_text** - the text of the chunk from which the question was generated.
- **generated_question** - the generated question.
- **generated_answer** - the corresponding generated answer. |
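Given the `"${REFINEDWEB_ID}_${CHUNK_INDEX}"` format described above, the two indices can be recovered from a `document_id` with a simple split (the helper name is just for illustration):

```python
def parse_document_id(document_id: str) -> tuple[int, int]:
    # "<refinedweb doc index>_<chunk index>"; rsplit keeps only the
    # last underscore as the separator.
    doc_idx, chunk_idx = document_id.rsplit("_", 1)
    return int(doc_idx), int(chunk_idx)

parse_document_id("12345_3")  # -> (12345, 3)
```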
librarian-bots/authors_merged_dataset_prs |
efederici/oasst2-it-top1 | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 161881
num_examples: 147
download_size: 91953
dataset_size: 161881
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jkarns/hinted_mbpp_llama2_7B_chat | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: prompt
path: data/prompt-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: task_id
dtype: int64
- name: text
dtype: string
- name: code
dtype: string
- name: test_list
sequence: string
- name: test_setup_code
dtype: string
- name: challenge_test_list
sequence: string
- name: hints
sequence: string
splits:
- name: test
num_bytes: 931659
num_examples: 500
- name: train
num_bytes: 678514
num_examples: 374
- name: prompt
num_bytes: 18915
num_examples: 10
- name: validation
num_bytes: 165965
num_examples: 90
download_size: 657926
dataset_size: 1795053
---
# Dataset Card for "hinted_mbpp_llama2_7B_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fucheng/train_data | ---
license: openrail
---
## file list
1. `JK_templates.zip`: some templates of mock spectra
2. |
LiveEvil/autotrain-data-mysheet | ---
language:
- en
---
# AutoTrain Dataset for project: mysheet
## Dataset Description
This dataset has been automatically processed by AutoTrain for project mysheet.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "The term \u201cpseudocode\u201d refers to writing code in a humanly understandable language such as English, and breaking it down to its core concepts.",
"question": "What is pseudocode?",
"answers.text": [
"Pseudocode is breaking down your code in English."
],
"answers.answer_start": [
33
]
},
{
"context": "Python is an interactive programming language designed for API and Machine Learning use.",
"question": "What is Python?",
"answers.text": [
"Python is an interactive programming language."
],
"answers.answer_start": [
0
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```
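In SQuAD-style datasets like this one, `answers.answer_start` is read as a character offset into `context`. A quick check against the second sample above shows the answers here are light paraphrases rather than exact extractive spans, so offset-based extraction may not align (the helper name is illustrative):

```python
def span_matches(context: str, start: int, text: str) -> bool:
    """True if the annotated answer literally appears at the given offset."""
    return context[start:start + len(text)] == text

sample = {
    "context": "Python is an interactive programming language designed for API and Machine Learning use.",
    "answers.text": ["Python is an interactive programming language."],
    "answers.answer_start": [0],
}
span_matches(sample["context"],
             sample["answers.answer_start"][0],
             sample["answers.text"][0])
# -> False: the annotated answer ends with ".", the context span does not
```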
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 3 |
| valid | 1 |
|
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_2_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1046633
num_examples: 6510
download_size: 442006
dataset_size: 1046633
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_2_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
enakilci/fi_corpora_parliament_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 309048227
num_examples: 1969624
download_size: 175929434
dataset_size: 309048227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ctam8736/papi_asr_mini | ---
license: mit
dataset_info:
features:
- name: audio
dtype: audio
- name: transcript
dtype: string
splits:
- name: train
num_bytes: 19144271.21495327
num_examples: 85
- name: test
num_bytes: 4207914.785046729
num_examples: 22
download_size: 22633018
dataset_size: 23352186.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Cohere/wikipedia-2023-11-embed-multilingual-v3-int8-binary | ---
configs:
- config_name: "ab"
data_files:
- split: train
path: ab/*
- config_name: "ace"
data_files:
- split: train
path: ace/*
- config_name: "ady"
data_files:
- split: train
path: ady/*
- config_name: "af"
data_files:
- split: train
path: af/*
- config_name: "als"
data_files:
- split: train
path: als/*
- config_name: "alt"
data_files:
- split: train
path: alt/*
- config_name: "am"
data_files:
- split: train
path: am/*
- config_name: "ami"
data_files:
- split: train
path: ami/*
- config_name: "an"
data_files:
- split: train
path: an/*
- config_name: "ang"
data_files:
- split: train
path: ang/*
- config_name: "anp"
data_files:
- split: train
path: anp/*
- config_name: "ar"
data_files:
- split: train
path: ar/*
- config_name: "arc"
data_files:
- split: train
path: arc/*
- config_name: "ary"
data_files:
- split: train
path: ary/*
- config_name: "arz"
data_files:
- split: train
path: arz/*
- config_name: "as"
data_files:
- split: train
path: as/*
- config_name: "ast"
data_files:
- split: train
path: ast/*
- config_name: "atj"
data_files:
- split: train
path: atj/*
- config_name: "av"
data_files:
- split: train
path: av/*
- config_name: "avk"
data_files:
- split: train
path: avk/*
- config_name: "awa"
data_files:
- split: train
path: awa/*
- config_name: "ay"
data_files:
- split: train
path: ay/*
- config_name: "az"
data_files:
- split: train
path: az/*
- config_name: "azb"
data_files:
- split: train
path: azb/*
- config_name: "ba"
data_files:
- split: train
path: ba/*
- config_name: "ban"
data_files:
- split: train
path: ban/*
- config_name: "bar"
data_files:
- split: train
path: bar/*
- config_name: "bat-smg"
data_files:
- split: train
path: bat-smg/*
- config_name: "bcl"
data_files:
- split: train
path: bcl/*
- config_name: "be"
data_files:
- split: train
path: be/*
- config_name: "be-x-old"
data_files:
- split: train
path: be-x-old/*
- config_name: "bg"
data_files:
- split: train
path: bg/*
- config_name: "bh"
data_files:
- split: train
path: bh/*
- config_name: "bi"
data_files:
- split: train
path: bi/*
- config_name: "bjn"
data_files:
- split: train
path: bjn/*
- config_name: "blk"
data_files:
- split: train
path: blk/*
- config_name: "bm"
data_files:
- split: train
path: bm/*
- config_name: "bn"
data_files:
- split: train
path: bn/*
- config_name: "bo"
data_files:
- split: train
path: bo/*
- config_name: "bpy"
data_files:
- split: train
path: bpy/*
- config_name: "br"
data_files:
- split: train
path: br/*
- config_name: "bs"
data_files:
- split: train
path: bs/*
- config_name: "bug"
data_files:
- split: train
path: bug/*
- config_name: "bxr"
data_files:
- split: train
path: bxr/*
- config_name: "ca"
data_files:
- split: train
path: ca/*
- config_name: "cbk-zam"
data_files:
- split: train
path: cbk-zam/*
- config_name: "cdo"
data_files:
- split: train
path: cdo/*
- config_name: "ce"
data_files:
- split: train
path: ce/*
- config_name: "ceb"
data_files:
- split: train
path: ceb/*
- config_name: "ch"
data_files:
- split: train
path: ch/*
- config_name: "chr"
data_files:
- split: train
path: chr/*
- config_name: "chy"
data_files:
- split: train
path: chy/*
- config_name: "ckb"
data_files:
- split: train
path: ckb/*
- config_name: "co"
data_files:
- split: train
path: co/*
- config_name: "cr"
data_files:
- split: train
path: cr/*
- config_name: "crh"
data_files:
- split: train
path: crh/*
- config_name: "cs"
data_files:
- split: train
path: cs/*
- config_name: "csb"
data_files:
- split: train
path: csb/*
- config_name: "cu"
data_files:
- split: train
path: cu/*
- config_name: "cv"
data_files:
- split: train
path: cv/*
- config_name: "cy"
data_files:
- split: train
path: cy/*
- config_name: "da"
data_files:
- split: train
path: da/*
- config_name: "dag"
data_files:
- split: train
path: dag/*
- config_name: "de"
data_files:
- split: train
path: de/*
- config_name: "din"
data_files:
- split: train
path: din/*
- config_name: "diq"
data_files:
- split: train
path: diq/*
- config_name: "dsb"
data_files:
- split: train
path: dsb/*
- config_name: "dty"
data_files:
- split: train
path: dty/*
- config_name: "dv"
data_files:
- split: train
path: dv/*
- config_name: "dz"
data_files:
- split: train
path: dz/*
- config_name: "ee"
data_files:
- split: train
path: ee/*
- config_name: "el"
data_files:
- split: train
path: el/*
- config_name: "eml"
data_files:
- split: train
path: eml/*
- config_name: "en"
data_files:
- split: train
path: en/*
- config_name: "eo"
data_files:
- split: train
path: eo/*
- config_name: "es"
data_files:
- split: train
path: es/*
- config_name: "et"
data_files:
- split: train
path: et/*
- config_name: "eu"
data_files:
- split: train
path: eu/*
- config_name: "ext"
data_files:
- split: train
path: ext/*
- config_name: "fa"
data_files:
- split: train
path: fa/*
- config_name: "fat"
data_files:
- split: train
path: fat/*
- config_name: "ff"
data_files:
- split: train
path: ff/*
- config_name: "fi"
data_files:
- split: train
path: fi/*
- config_name: "fiu-vro"
data_files:
- split: train
path: fiu-vro/*
- config_name: "fj"
data_files:
- split: train
path: fj/*
- config_name: "fo"
data_files:
- split: train
path: fo/*
- config_name: "fon"
data_files:
- split: train
path: fon/*
- config_name: "fr"
data_files:
- split: train
path: fr/*
- config_name: "frp"
data_files:
- split: train
path: frp/*
- config_name: "frr"
data_files:
- split: train
path: frr/*
- config_name: "fur"
data_files:
- split: train
path: fur/*
- config_name: "fy"
data_files:
- split: train
path: fy/*
- config_name: "ga"
data_files:
- split: train
path: ga/*
- config_name: "gag"
data_files:
- split: train
path: gag/*
- config_name: "gan"
data_files:
- split: train
path: gan/*
- config_name: "gcr"
data_files:
- split: train
path: gcr/*
- config_name: "gd"
data_files:
- split: train
path: gd/*
- config_name: "gl"
data_files:
- split: train
path: gl/*
- config_name: "glk"
data_files:
- split: train
path: glk/*
- config_name: "gn"
data_files:
- split: train
path: gn/*
- config_name: "gom"
data_files:
- split: train
path: gom/*
- config_name: "gor"
data_files:
- split: train
path: gor/*
- config_name: "got"
data_files:
- split: train
path: got/*
- config_name: "gpe"
data_files:
- split: train
path: gpe/*
- config_name: "gu"
data_files:
- split: train
path: gu/*
- config_name: "guc"
data_files:
- split: train
path: guc/*
- config_name: "gur"
data_files:
- split: train
path: gur/*
- config_name: "guw"
data_files:
- split: train
path: guw/*
- config_name: "gv"
data_files:
- split: train
path: gv/*
- config_name: "ha"
data_files:
- split: train
path: ha/*
- config_name: "hak"
data_files:
- split: train
path: hak/*
- config_name: "haw"
data_files:
- split: train
path: haw/*
- config_name: "he"
data_files:
- split: train
path: he/*
- config_name: "hi"
data_files:
- split: train
path: hi/*
- config_name: "hif"
data_files:
- split: train
path: hif/*
- config_name: "hr"
data_files:
- split: train
path: hr/*
- config_name: "hsb"
data_files:
- split: train
path: hsb/*
- config_name: "ht"
data_files:
- split: train
path: ht/*
- config_name: "hu"
data_files:
- split: train
path: hu/*
- config_name: "hy"
data_files:
- split: train
path: hy/*
- config_name: "hyw"
data_files:
- split: train
path: hyw/*
- config_name: "ia"
data_files:
- split: train
path: ia/*
- config_name: "id"
data_files:
- split: train
path: id/*
- config_name: "ie"
data_files:
- split: train
path: ie/*
- config_name: "ig"
data_files:
- split: train
path: ig/*
- config_name: "ik"
data_files:
- split: train
path: ik/*
- config_name: "ilo"
data_files:
- split: train
path: ilo/*
- config_name: "inh"
data_files:
- split: train
path: inh/*
- config_name: "io"
data_files:
- split: train
path: io/*
- config_name: "is"
data_files:
- split: train
path: is/*
- config_name: "it"
data_files:
- split: train
path: it/*
- config_name: "iu"
data_files:
- split: train
path: iu/*
- config_name: "ja"
data_files:
- split: train
path: ja/*
- config_name: "jam"
data_files:
- split: train
path: jam/*
- config_name: "jbo"
data_files:
- split: train
path: jbo/*
- config_name: "jv"
data_files:
- split: train
path: jv/*
- config_name: "ka"
data_files:
- split: train
path: ka/*
- config_name: "kaa"
data_files:
- split: train
path: kaa/*
- config_name: "kab"
data_files:
- split: train
path: kab/*
- config_name: "kbd"
data_files:
- split: train
path: kbd/*
- config_name: "kbp"
data_files:
- split: train
path: kbp/*
- config_name: "kcg"
data_files:
- split: train
path: kcg/*
- config_name: "kg"
data_files:
- split: train
path: kg/*
- config_name: "ki"
data_files:
- split: train
path: ki/*
- config_name: "kk"
data_files:
- split: train
path: kk/*
- config_name: "kl"
data_files:
- split: train
path: kl/*
- config_name: "km"
data_files:
- split: train
path: km/*
- config_name: "kn"
data_files:
- split: train
path: kn/*
- config_name: "ko"
data_files:
- split: train
path: ko/*
- config_name: "koi"
data_files:
- split: train
path: koi/*
- config_name: "krc"
data_files:
- split: train
path: krc/*
- config_name: "ks"
data_files:
- split: train
path: ks/*
- config_name: "ksh"
data_files:
- split: train
path: ksh/*
- config_name: "ku"
data_files:
- split: train
path: ku/*
- config_name: "kv"
data_files:
- split: train
path: kv/*
- config_name: "kw"
data_files:
- split: train
path: kw/*
- config_name: "ky"
data_files:
- split: train
path: ky/*
- config_name: "la"
data_files:
- split: train
path: la/*
- config_name: "lad"
data_files:
- split: train
path: lad/*
- config_name: "lb"
data_files:
- split: train
path: lb/*
- config_name: "lbe"
data_files:
- split: train
path: lbe/*
- config_name: "lez"
data_files:
- split: train
path: lez/*
- config_name: "lfn"
data_files:
- split: train
path: lfn/*
- config_name: "lg"
data_files:
- split: train
path: lg/*
- config_name: "li"
data_files:
- split: train
path: li/*
- config_name: "lij"
data_files:
- split: train
path: lij/*
- config_name: "lld"
data_files:
- split: train
path: lld/*
- config_name: "lmo"
data_files:
- split: train
path: lmo/*
- config_name: "ln"
data_files:
- split: train
path: ln/*
- config_name: "lo"
data_files:
- split: train
path: lo/*
- config_name: "lt"
data_files:
- split: train
path: lt/*
- config_name: "ltg"
data_files:
- split: train
path: ltg/*
- config_name: "lv"
data_files:
- split: train
path: lv/*
- config_name: "mad"
data_files:
- split: train
path: mad/*
- config_name: "mai"
data_files:
- split: train
path: mai/*
- config_name: "map-bms"
data_files:
- split: train
path: map-bms/*
- config_name: "mdf"
data_files:
- split: train
path: mdf/*
- config_name: "mg"
data_files:
- split: train
path: mg/*
- config_name: "mhr"
data_files:
- split: train
path: mhr/*
- config_name: "mi"
data_files:
- split: train
path: mi/*
- config_name: "min"
data_files:
- split: train
path: min/*
- config_name: "mk"
data_files:
- split: train
path: mk/*
- config_name: "ml"
data_files:
- split: train
path: ml/*
- config_name: "mn"
data_files:
- split: train
path: mn/*
- config_name: "mni"
data_files:
- split: train
path: mni/*
- config_name: "mnw"
data_files:
- split: train
path: mnw/*
- config_name: "mr"
data_files:
- split: train
path: mr/*
- config_name: "mrj"
data_files:
- split: train
path: mrj/*
- config_name: "ms"
data_files:
- split: train
path: ms/*
- config_name: "mt"
data_files:
- split: train
path: mt/*
- config_name: "mwl"
data_files:
- split: train
path: mwl/*
- config_name: "my"
data_files:
- split: train
path: my/*
- config_name: "myv"
data_files:
- split: train
path: myv/*
- config_name: "mzn"
data_files:
- split: train
path: mzn/*
- config_name: "nah"
data_files:
- split: train
path: nah/*
- config_name: "nap"
data_files:
- split: train
path: nap/*
- config_name: "nds"
data_files:
- split: train
path: nds/*
- config_name: "nds-nl"
data_files:
- split: train
path: nds-nl/*
- config_name: "ne"
data_files:
- split: train
path: ne/*
- config_name: "new"
data_files:
- split: train
path: new/*
- config_name: "nia"
data_files:
- split: train
path: nia/*
- config_name: "nl"
data_files:
- split: train
path: nl/*
- config_name: "nn"
data_files:
- split: train
path: nn/*
- config_name: "no"
data_files:
- split: train
path: no/*
- config_name: "nov"
data_files:
- split: train
path: nov/*
- config_name: "nqo"
data_files:
- split: train
path: nqo/*
- config_name: "nrm"
data_files:
- split: train
path: nrm/*
- config_name: "nso"
data_files:
- split: train
path: nso/*
- config_name: "nv"
data_files:
- split: train
path: nv/*
- config_name: "ny"
data_files:
- split: train
path: ny/*
- config_name: "oc"
data_files:
- split: train
path: oc/*
- config_name: "olo"
data_files:
- split: train
path: olo/*
- config_name: "om"
data_files:
- split: train
path: om/*
- config_name: "or"
data_files:
- split: train
path: or/*
- config_name: "os"
data_files:
- split: train
path: os/*
- config_name: "pa"
data_files:
- split: train
path: pa/*
- config_name: "pag"
data_files:
- split: train
path: pag/*
- config_name: "pam"
data_files:
- split: train
path: pam/*
- config_name: "pap"
data_files:
- split: train
path: pap/*
- config_name: "pcd"
data_files:
- split: train
path: pcd/*
- config_name: "pcm"
data_files:
- split: train
path: pcm/*
- config_name: "pdc"
data_files:
- split: train
path: pdc/*
- config_name: "pfl"
data_files:
- split: train
path: pfl/*
- config_name: "pi"
data_files:
- split: train
path: pi/*
- config_name: "pih"
data_files:
- split: train
path: pih/*
- config_name: "pl"
data_files:
- split: train
path: pl/*
- config_name: "pms"
data_files:
- split: train
path: pms/*
- config_name: "pnb"
data_files:
- split: train
path: pnb/*
- config_name: "pnt"
data_files:
- split: train
path: pnt/*
- config_name: "ps"
data_files:
- split: train
path: ps/*
- config_name: "pt"
data_files:
- split: train
path: pt/*
- config_name: "pwn"
data_files:
- split: train
path: pwn/*
- config_name: "qu"
data_files:
- split: train
path: qu/*
- config_name: "rm"
data_files:
- split: train
path: rm/*
- config_name: "rmy"
data_files:
- split: train
path: rmy/*
- config_name: "rn"
data_files:
- split: train
path: rn/*
- config_name: "ro"
data_files:
- split: train
path: ro/*
- config_name: "roa-rup"
data_files:
- split: train
path: roa-rup/*
- config_name: "roa-tara"
data_files:
- split: train
path: roa-tara/*
- config_name: "ru"
data_files:
- split: train
path: ru/*
- config_name: "rue"
data_files:
- split: train
path: rue/*
- config_name: "rw"
data_files:
- split: train
path: rw/*
- config_name: "sa"
data_files:
- split: train
path: sa/*
- config_name: "sah"
data_files:
- split: train
path: sah/*
- config_name: "sat"
data_files:
- split: train
path: sat/*
- config_name: "sc"
data_files:
- split: train
path: sc/*
- config_name: "scn"
data_files:
- split: train
path: scn/*
- config_name: "sco"
data_files:
- split: train
path: sco/*
- config_name: "sd"
data_files:
- split: train
path: sd/*
- config_name: "se"
data_files:
- split: train
path: se/*
- config_name: "sg"
data_files:
- split: train
path: sg/*
- config_name: "sh"
data_files:
- split: train
path: sh/*
- config_name: "shi"
data_files:
- split: train
path: shi/*
- config_name: "shn"
data_files:
- split: train
path: shn/*
- config_name: "si"
data_files:
- split: train
path: si/*
- config_name: "simple"
data_files:
- split: train
path: simple/*
- config_name: "sk"
data_files:
- split: train
path: sk/*
- config_name: "skr"
data_files:
- split: train
path: skr/*
- config_name: "sl"
data_files:
- split: train
path: sl/*
- config_name: "sm"
data_files:
- split: train
path: sm/*
- config_name: "smn"
data_files:
- split: train
path: smn/*
- config_name: "sn"
data_files:
- split: train
path: sn/*
- config_name: "so"
data_files:
- split: train
path: so/*
- config_name: "sq"
data_files:
- split: train
path: sq/*
- config_name: "sr"
data_files:
- split: train
path: sr/*
- config_name: "srn"
data_files:
- split: train
path: srn/*
- config_name: "ss"
data_files:
- split: train
path: ss/*
- config_name: "st"
data_files:
- split: train
path: st/*
- config_name: "stq"
data_files:
- split: train
path: stq/*
- config_name: "su"
data_files:
- split: train
path: su/*
- config_name: "sv"
data_files:
- split: train
path: sv/*
- config_name: "sw"
data_files:
- split: train
path: sw/*
- config_name: "szl"
data_files:
- split: train
path: szl/*
- config_name: "szy"
data_files:
- split: train
path: szy/*
- config_name: "ta"
data_files:
- split: train
path: ta/*
- config_name: "tay"
data_files:
- split: train
path: tay/*
- config_name: "tcy"
data_files:
- split: train
path: tcy/*
- config_name: "te"
data_files:
- split: train
path: te/*
- config_name: "tet"
data_files:
- split: train
path: tet/*
- config_name: "tg"
data_files:
- split: train
path: tg/*
- config_name: "th"
data_files:
- split: train
path: th/*
- config_name: "ti"
data_files:
- split: train
path: ti/*
- config_name: "tk"
data_files:
- split: train
path: tk/*
- config_name: "tl"
data_files:
- split: train
path: tl/*
- config_name: "tly"
data_files:
- split: train
path: tly/*
- config_name: "tn"
data_files:
- split: train
path: tn/*
- config_name: "to"
data_files:
- split: train
path: to/*
- config_name: "tpi"
data_files:
- split: train
path: tpi/*
- config_name: "tr"
data_files:
- split: train
path: tr/*
- config_name: "trv"
data_files:
- split: train
path: trv/*
- config_name: "ts"
data_files:
- split: train
path: ts/*
- config_name: "tt"
data_files:
- split: train
path: tt/*
- config_name: "tum"
data_files:
- split: train
path: tum/*
- config_name: "tw"
data_files:
- split: train
path: tw/*
- config_name: "ty"
data_files:
- split: train
path: ty/*
- config_name: "tyv"
data_files:
- split: train
path: tyv/*
- config_name: "udm"
data_files:
- split: train
path: udm/*
- config_name: "ug"
data_files:
- split: train
path: ug/*
- config_name: "uk"
data_files:
- split: train
path: uk/*
- config_name: "ur"
data_files:
- split: train
path: ur/*
- config_name: "uz"
data_files:
- split: train
path: uz/*
- config_name: "ve"
data_files:
- split: train
path: ve/*
- config_name: "vec"
data_files:
- split: train
path: vec/*
- config_name: "vep"
data_files:
- split: train
path: vep/*
- config_name: "vi"
data_files:
- split: train
path: vi/*
- config_name: "vls"
data_files:
- split: train
path: vls/*
- config_name: "vo"
data_files:
- split: train
path: vo/*
- config_name: "wa"
data_files:
- split: train
path: wa/*
- config_name: "war"
data_files:
- split: train
path: war/*
- config_name: "wo"
data_files:
- split: train
path: wo/*
- config_name: "wuu"
data_files:
- split: train
path: wuu/*
- config_name: "xal"
data_files:
- split: train
path: xal/*
- config_name: "xh"
data_files:
- split: train
path: xh/*
- config_name: "xmf"
data_files:
- split: train
path: xmf/*
- config_name: "yi"
data_files:
- split: train
path: yi/*
- config_name: "yo"
data_files:
- split: train
path: yo/*
- config_name: "za"
data_files:
- split: train
path: za/*
- config_name: "zea"
data_files:
- split: train
path: zea/*
- config_name: "zh"
data_files:
- split: train
path: zh/*
- config_name: "zh-classical"
data_files:
- split: train
path: zh-classical/*
- config_name: "zh-min-nan"
data_files:
- split: train
path: zh-min-nan/*
- config_name: "zh-yue"
data_files:
- split: train
path: zh-yue/*
- config_name: "zu"
data_files:
- split: train
path: zu/*
---
# Multilingual Embeddings for Wikipedia in 300+ Languages (int8 & binary embeddings)
This dataset contains the [wikimedia/wikipedia](https://huggingface.co/datasets/wikimedia/wikipedia) dataset dump from 2023-11-01 for Wikipedia in all 300+ languages. The embeddings are provided in **int8** and **ubinary** formats, which allow quick search and reduce your vector index size by up to 32x. For more details, see [Cohere int8 & binary Embeddings](https://txt.cohere.com/int8-binary-embeddings/)
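A quick back-of-the-envelope calculation shows where the 4x and 32x savings come from (the ~250M passage count and 1024-dimensional embeddings are from this card; real index sizes add some structural overhead):

```python
# Approximate raw vector storage for ~250M passage embeddings at 1024 dimensions
# (raw vectors only; index data structures add overhead).
num_vectors = 250_000_000
dims = 1024

float32_bytes = num_vectors * dims * 4     # 4 bytes per dimension
int8_bytes = num_vectors * dims * 1        # 1 byte per dimension  -> 4x smaller
binary_bytes = num_vectors * (dims // 8)   # 1 bit per dimension   -> 32x smaller

print(f"float32: {float32_bytes / 1e9:.0f} GB")  # float32: 1024 GB
print(f"int8:    {int8_bytes / 1e9:.0f} GB")     # int8:    256 GB
print(f"binary:  {binary_bytes / 1e9:.0f} GB")   # binary:  32 GB
```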

The individual articles have been chunked and embedded with the state-of-the-art multilingual [Cohere Embed V3 embedding model](https://txt.cohere.com/introducing-embed-v3/). This enables easy semantic search across all of Wikipedia, or using it as a knowledge source for your RAG application. In total, there are close to 250M paragraphs / embeddings.
You can also use the model to perform cross-lingual search: Enter your search query in any language and get the most relevant results back.
## Loading the dataset
### Loading the document embeddings
You can either load the dataset like this:
```python
from datasets import load_dataset
lang = "simple" #Use the Simple English Wikipedia subset
docs = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3-int8-binary", lang, split="train")
```
Or you can stream it without downloading it first:
```python
from datasets import load_dataset
lang = "simple" #Use the Simple English Wikipedia subset
docs = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3-int8-binary", lang, split="train", streaming=True)
for doc in docs:
    doc_id = doc['_id']
    title = doc['title']
    text = doc['text']
    emb_int8 = doc['emb_int8']        # int8 embeddings
    emb_ubinary = doc['emb_ubinary']  # (unsigned) binary embeddings
```
Note, depending on the language, the download can be quite large.
## Search - int8 embeddings
int8 embeddings keep 99.7-100% of the search quality, while reducing your vector index size 4x and speeding up vector search by up to 30%.
Below is an example using [usearch](https://github.com/unum-cloud/usearch) to search on the int8 embeddings.
```python
from datasets import load_dataset
import numpy as np
import cohere
from usearch.index import Index
co = cohere.Client("<<YOUR_API_KEY>>") # Add your cohere API key from www.cohere.com
lang = "simple"
# Load at most 1000 chunks + embeddings
max_docs = 1000
docs_stream = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3-int8-binary", lang, split="train", streaming=True)
docs = []
doc_embeddings = []
for doc in docs_stream:
    docs.append(doc)
    doc_embeddings.append(doc['emb_int8'])
    if len(docs) >= max_docs:
        break
doc_embeddings = np.asarray(doc_embeddings, dtype='int8')
print(doc_embeddings.shape, doc_embeddings.dtype)
#Create the usearch HNSW index on the int8 embeddings
num_dim = 1024
index = Index(ndim=num_dim, metric='ip', dtype='i8')
index.add(np.arange(len(doc_embeddings)), doc_embeddings)
#Searching
query = 'Who was Alan Turing'
query_emb = co.embed(texts=[query], model="embed-multilingual-v3.0", input_type="search_query", embedding_types=["int8"]).embeddings
query_emb = np.asarray(query_emb.int8, dtype='int8')
#Search on the index and get top-10 results
matches = index.search(query_emb, 10)
# Print results
for match in matches:
    doc_id = match.key
    print(docs[doc_id]['title'])
    print(docs[doc_id]['text'])
    print("----")
```
## Search - Binary embeddings
Binary embeddings keep 95% of the search quality, while reducing your vector index size 32x and speeding up vector search by up to 40x.
Below is an example using [faiss](https://github.com/facebookresearch/faiss) to search on the binary embeddings.
```python
#Run: pip install cohere datasets numpy faiss-cpu
from datasets import load_dataset
import numpy as np
import cohere
import faiss
co = cohere.Client("<<YOUR_API_KEY>>") # Add your cohere API key from www.cohere.com
lang = "simple"
# Load at most 1000 chunks + embeddings
max_docs = 1000
docs_stream = load_dataset("Cohere/wikipedia-2023-11-embed-multilingual-v3-int8-binary", lang, split="train", streaming=True)
docs = []
doc_embeddings = []
for doc in docs_stream:
    docs.append(doc)
    doc_embeddings.append(doc['emb_ubinary'])
    if len(docs) >= max_docs:
        break
doc_embeddings = np.asarray(doc_embeddings, dtype='uint8')
#Create the faiss IndexBinaryFlat index
num_dim = 1024
index = faiss.IndexBinaryFlat(num_dim)
index.add(doc_embeddings)
#Search
def search(index, query, top_k=10):
    # Make sure to set input_type="search_query"
    query_emb = co.embed(texts=[query], model="embed-multilingual-v3.0", input_type="search_query", embedding_types=["ubinary", "float"]).embeddings
    query_emb_bin = np.asarray(query_emb.ubinary, dtype='uint8')
    query_emb_float = np.asarray(query_emb.float, dtype="float32")

    # Phase I: search on the index with the binary query embedding
    hits_scores, hits_doc_ids = index.search(query_emb_bin, k=min(10*top_k, index.ntotal))

    # Collect the results in a list of hits
    hits = [{'doc_id': doc_id.item(), 'score_bin': score_bin} for doc_id, score_bin in zip(hits_doc_ids[0], hits_scores[0])]

    # Phase II: re-score the hits with the float query embedding
    binary_doc_emb = np.asarray([index.reconstruct(hit['doc_id']) for hit in hits])
    binary_doc_emb_unpacked = np.unpackbits(binary_doc_emb, axis=-1).astype("int")
    binary_doc_emb_unpacked = 2 * binary_doc_emb_unpacked - 1
    scores_cont = query_emb_float[0] @ binary_doc_emb_unpacked.T
    for idx in range(len(scores_cont)):
        hits[idx]['score_cont'] = scores_cont[idx]

    # Sort by largest score_cont
    hits.sort(key=lambda x: x['score_cont'], reverse=True)
    return hits[0:top_k]
query = 'Who was Alan Turing'
hits = search(index, query)
# Print results
print("Query:", query)
for hit in hits:
    doc_id = hit['doc_id']
    print(docs[doc_id]['title'])
    print(docs[doc_id]['text'])
    print(docs[doc_id]['url'], "\n")
```
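The Phase II re-scoring above unpacks each binary document embedding back into individual bits, maps them from {0, 1} to {-1, +1}, and takes the dot product with the float query embedding. A minimal standalone sketch of that step, using a made-up 8-dimensional example rather than real model output:

```python
import numpy as np

# One packed binary doc embedding: 0b10110001 unpacks to bits [1,0,1,1,0,0,0,1]
packed_doc = np.array([[0b10110001]], dtype=np.uint8)

# A made-up 8-dimensional float query embedding
query = np.array([0.5, -0.2, 0.1, 0.4, -0.3, 0.0, 0.2, -0.1], dtype=np.float32)

bits = np.unpackbits(packed_doc, axis=-1).astype(int)  # shape (1, 8)
signed = 2 * bits - 1                                  # {0,1} -> {-1,+1}
score = float(query @ signed[0])                       # continuous re-score
print(score)
```

This continuous score ranks the binary hits more precisely than the Hamming-distance scores from Phase I, at the cost of one extra dot product per candidate.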
## Overview
The following table contains all language codes together with the total numbers of passages.
| Language | #Docs |
|---|:---:|
| en | 41,488,110 |
| de | 20,772,081 |
| fr | 17,813,768 |
| ru | 13,734,543 |
| es | 12,905,284 |
| it | 10,462,162 |
| ceb | 9,818,657 |
| uk | 6,901,192 |
| ja | 6,626,537 |
| nl | 6,101,353 |
| pl | 5,973,650 |
| pt | 5,637,930 |
| sv | 4,911,480 |
| ca | 4,156,889 |
| ar | 3,691,451 |
| cs | 3,118,524 |
| he | 2,948,882 |
| hu | 2,924,609 |
| vi | 2,835,049 |
| zh | 2,775,260 |
| fi | 2,427,097 |
| id | 2,358,335 |
| no | 2,211,270 |
| sr | 2,154,631 |
| fa | 2,073,154 |
| tr | 1,775,036 |
| ro | 1,770,527 |
| el | 1,599,770 |
| ko | 1,513,291 |
| bg | 1,455,765 |
| hy | 1,386,140 |
| eu | 1,327,579 |
| da | 1,224,982 |
| eo | 1,216,706 |
| war | 1,185,097 |
| sh | 1,139,309 |
| tt | 1,119,546 |
| arz | 1,089,164 |
| gl | 1,056,990 |
| et | 1,054,770 |
| ce | 1,013,217 |
| ast | 1,010,445 |
| sl | 984,855 |
| hr | 910,923 |
| sk | 874,014 |
| ms | 869,579 |
| be | 857,033 |
| th | 839,712 |
| az | 817,631 |
| uz | 811,028 |
| mk | 784,576 |
| lt | 770,069 |
| bn | 767,965 |
| cy | 762,338 |
| ta | 685,975 |
| simple | 646,424 |
| te | 634,778 |
| kk | 627,085 |
| ka | 595,401 |
| hi | 541,822 |
| nn | 530,590 |
| lv | 484,957 |
| af | 461,674 |
| ba | 434,939 |
| ur | 434,269 |
| bs | 396,692 |
| sq | 388,788 |
| ml | 384,795 |
| min | 373,156 |
| la | 340,521 |
| pnb | 335,958 |
| be-x-old | 314,600 |
| kn | 309,085 |
| azb | 294,248 |
| oc | 283,297 |
| zh-min-nan | 278,547 |
| fy | 248,075 |
| my | 241,025 |
| lb | 216,558 |
| ky | 216,344 |
| als | 206,387 |
| mr | 203,479 |
| br | 200,486 |
| pa | 188,091 |
| is | 177,272 |
| mg | 171,947 |
| sw | 171,650 |
| ha | 167,807 |
| tl | 166,907 |
| nds | 166,019 |
| an | 143,163 |
| jv | 142,104 |
| ps | 138,240 |
| ig | 132,250 |
| new | 128,696 |
| tg | 128,237 |
| ga | 125,456 |
| lld | 125,094 |
| su | 124,390 |
| cv | 122,671 |
| ckb | 120,886 |
| si | 119,223 |
| mn | 114,878 |
| lmo | 103,836 |
| io | 101,713 |
| gu | 99,450 |
| vec | 95,072 |
| zh-yue | 89,145 |
| bar | 88,238 |
| sco | 83,906 |
| ne | 83,598 |
| ku | 82,935 |
| hyw | 82,343 |
| pms | 77,834 |
| as | 76,093 |
| km | 74,177 |
| sah | 71,599 |
| li | 69,267 |
| or | 65,510 |
| mt | 64,038 |
| szl | 56,836 |
| yi | 55,375 |
| ht | 55,079 |
| dag | 53,343 |
| sa | 51,735 |
| nv | 49,355 |
| bpy | 47,757 |
| vo | 47,375 |
| ug | 44,764 |
| sat | 43,500 |
| ia | 42,012 |
| bo | 41,438 |
| mwl | 41,273 |
| sd | 40,395 |
| bcl | 39,967 |
| mnw | 39,578 |
| hsb | 39,560 |
| avk | 39,001 |
| scn | 38,359 |
| rm | 37,436 |
| diq | 34,743 |
| vep | 33,654 |
| xmf | 33,238 |
| ban | 32,320 |
| wa | 32,132 |
| ilo | 31,046 |
| nds-nl | 30,918 |
| qu | 30,529 |
| so | 29,936 |
| mhr | 29,619 |
| vls | 29,227 |
| sc | 28,977 |
| fo | 28,809 |
| gd | 28,149 |
| rw | 28,037 |
| gom | 27,792 |
| yo | 27,789 |
| tum | 26,743 |
| wuu | 26,532 |
| frr | 26,010 |
| sn | 25,941 |
| tk | 24,269 |
| blk | 24,194 |
| mzn | 23,837 |
| co | 23,065 |
| szy | 22,854 |
| am | 22,467 |
| shn | 22,432 |
| skr | 21,081 |
| lfn | 20,781 |
| tyv | 20,762 |
| lij | 20,553 |
| ie | 19,994 |
| rue | 19,916 |
| crh | 19,016 |
| gor | 18,146 |
| ary | 17,463 |
| dv | 16,941 |
| lg | 16,751 |
| roa-tara | 16,572 |
| bjn | 16,429 |
| tw | 16,304 |
| bh | 15,938 |
| pam | 15,134 |
| os | 15,096 |
| myv | 15,062 |
| gn | 14,983 |
| lez | 14,152 |
| mai | 13,806 |
| kv | 13,534 |
| pcd | 13,057 |
| zh-classical | 12,791 |
| zea | 12,528 |
| lo | 12,525 |
| gv | 12,074 |
| stq | 11,890 |
| zu | 11,680 |
| smn | 11,672 |
| kw | 11,539 |
| bat-smg | 11,240 |
| hif | 11,215 |
| ext | 10,967 |
| ace | 10,821 |
| trv | 10,546 |
| ami | 10,538 |
| tcy | 10,531 |
| lad | 10,386 |
| alt | 10,256 |
| pap | 10,187 |
| kab | 10,179 |
| fur | 10,148 |
| nap | 10,079 |
| mrj | 9,771 |
| kaa | 9,548 |
| nqo | 9,153 |
| glk | 9,120 |
| pfl | 8,790 |
| fiu-vro | 8,757 |
| nso | 8,635 |
| jbo | 8,577 |
| bxr | 8,549 |
| wo | 8,549 |
| olo | 8,530 |
| map-bms | 8,393 |
| ksh | 8,226 |
| csb | 8,085 |
| av | 7,873 |
| mni | 7,740 |
| udm | 7,730 |
| mi | 7,643 |
| kbp | 7,616 |
| dsb | 7,536 |
| frp | 7,294 |
| om | 7,045 |
| ang | 7,023 |
| hak | 6,866 |
| gur | 6,761 |
| se | 6,733 |
| anp | 6,704 |
| tay | 6,434 |
| mdf | 6,351 |
| gcr | 6,347 |
| koi | 6,300 |
| krc | 6,293 |
| ay | 5,985 |
| cdo | 5,917 |
| nrm | 5,786 |
| xh | 5,756 |
| tn | 5,712 |
| tly | 5,598 |
| shi | 5,179 |
| pcm | 5,076 |
| fat | 4,968 |
| nia | 4,795 |
| dty | 4,728 |
| kbd | 4,667 |
| gpe | 4,289 |
| cbk-zam | 4,224 |
| ff | 4,166 |
| dz | 4,117 |
| guw | 3,982 |
| eml | 3,979 |
| ln | 3,774 |
| inh | 3,768 |
| nah | 3,720 |
| ab | 3,465 |
| ks | 3,255 |
| mad | 3,236 |
| haw | 3,227 |
| gag | 3,076 |
| tet | 3,030 |
| ny | 2,933 |
| pag | 2,727 |
| guc | 2,454 |
| roa-rup | 2,409 |
| jam | 2,387 |
| awa | 2,242 |
| pdc | 2,239 |
| to | 2,165 |
| za | 2,132 |
| st | 2,051 |
| ltg | 2,005 |
| atj | 1,967 |
| nov | 1,916 |
| ss | 1,904 |
| pwn | 1,881 |
| ee | 1,819 |
| sm | 1,659 |
| ts | 1,645 |
| gan | 1,626 |
| xal | 1,619 |
| kcg | 1,555 |
| cu | 1,477 |
| srn | 1,395 |
| got | 1,280 |
| fon | 1,247 |
| din | 1,214 |
| arc | 1,167 |
| fj | 1,164 |
| rmy | 1,113 |
| ady | 1,040 |
| rn | 1,033 |
| bm | 1,017 |
| tpi | 957 |
| ve | 919 |
| ki | 798 |
| pnt | 796 |
| chr | 788 |
| kl | 770 |
| lbe | 766 |
| bi | 718 |
| ti | 706 |
| kg | 609 |
| pih | 606 |
| ch | 513 |
| bug | 429 |
| ty | 297 |
| ik | 275 |
| iu | 263 |
| pi | 260 |
| sg | 204 |
| chy | 57 |
| cr | 41 |
| Total | 247,154,006 |
|
charlieoneill/resid-streams | ---
dataset_info:
features:
- name: resid_streams
dtype: int64
splits:
- name: train
num_bytes: 800
num_examples: 100
download_size: 1406
dataset_size: 800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-zeroshot__twitter-financial-news-topic-zeroshot__twitte-178919-28982144928 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- zeroshot/twitter-financial-news-topic
eval_info:
task: summarization
model: phpaiola/ptt5-base-summ-temario
metrics: ['bertscore']
dataset_name: zeroshot/twitter-financial-news-topic
dataset_config: zeroshot--twitter-financial-news-topic
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: phpaiola/ptt5-base-summ-temario
* Dataset: zeroshot/twitter-financial-news-topic
* Config: zeroshot--twitter-financial-news-topic
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@peterdevathala](https://huggingface.co/peterdevathala) for evaluating this model. |
kqsong/OASum | ---
license: cc-by-sa-3.0
language:
- en
tags:
- summarization
- Wikipedia
size_categories:
- 1M<n<10M
task_categories:
- summarization
---
# Dataset Card for OASum Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Usage](#dataset-usage)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** [OASum Dataset repository](https://github.com/tencent-ailab/OASum)
- **Paper:** [OASum: Large-Scale Open Domain Aspect-based Summarization](https://arxiv.org/pdf/2212.09233.pdf)
The OASum Dataset is an English-language dataset containing over 3.6M document, aspect, and summary triplets.
## Dataset Usage
You can directly download it with huggingface datasets.
```python
from datasets import load_dataset
dataset = load_dataset("kqsong/OASum")
```
## Dataset Structure
### Data Instances
For each instance, there is a list of strings for the document, a list of strings for the summary, a string for the document title, a string for the aspect and a list of indices for the sentences in the corresponding section.
```json
{
"title": "Ker's WingHouse Bar & Grill",
"document":[
"After Clearwater, Florida chicken wing pioneering restaurant chain Hooters began rapidly expanding, Florida based, Canadian-born restaurant entrepreneur Ed Burnett saw the opportunity.",
"Burnett secured the rights to a closed restaurant (\"Knockers\") and opened \"The WingHouse\" restaurant at 7369 Ulmerton Road, Largo, Florida, a high traffic corridor.",
"He strategically selected the restaurant in between where people work (commercial real estate) and live (residential real estate), to appeal to the local lunch crowd and family dining crowd.",
"This flagship location proved to be a success soon after launching and is the model that the chain expanded on.",
"Burnett, looking to expand to additional locations, accepted a financing partner (Crawford Ker) during this time frame, to open additional locations and beyond.",
"Burnett's goal was to open 20 to 50 locations, and then sell the chain to a larger restaurant chain or investors.",
"Burnett would ultimately regret his choice of investor.","In 1992, Ker retired from the NFL and took a job selling cars at a local dealer.",
"In 1994, he invested half interest in a Largo, Florida wing restaurant called, \"Wing House\" that imitated Hooters.",
"The restaurant was always The Wing House, and the atmosphere was always toned down to make it more family friendly.",
"The restaurant did well and two additional locations were opened in the Tampa Bay area in the following three years.",
"Ker won a $1.2-million jury award from Hooters in late 2004, which had sued him for trademark violations for allegedly using their uniforms and decor.",
"After a three-week trial in which lawyers discussed hula hoops, surfboards, scrunchy socks, pantyhose, and something called \"vicarious sexual recreation\", the jury ruled that no trademark infringement existed and Hooters was penalized for their frivolous lawsuit.",
"Hooters appealed the decision, but in June, 2006, the 11th U.S. Circuit Court of Appeals in Atlanta upheld the verdict.",
"As of 2007, the company had 1,700 employees at 22 locations with revenue of nearly $60 million.",
"Ker attended, and the company participated in, the 2007 National Buffalo Wing Festival and placed first in the \"traditional x-hot sauce\" category and gained some national recognition.",
"On June 4, 2008 the company announced the launch of its national franchise program.",
"In mid-2008 the chain operated 19 locations in Florida and Texas and expected to add six franchises by the end of 2008, and 48 by 2011.",
"The initial focus was for franchises in the Southeastern US.",
"WingHouses feature several amenities that differ from other wing restaurants, including Hooters.",
"There is a full liquor bar in every store, sports memorabilia line the walls instead of NASCAR and most locations include a game room.",
"Super Bowl XLIII in Tampa, Florida attracted the rich and famous; WingHouse hosted three events to raise money for charity."
],
"aspect": "Opening",
"aspect_sents": [0,1,2,3,4,5,6,7,8,9,10],
"summary":[
"WingHouse Bar & Grill (formerly Ker\u2019s WingHouse Bar & Grill) is a restaurant chain based in Florida, created and founded by Ed Burnett, a Canadian restaurant entrepreneur.",
"After opening his first WingHouse location, Burnett sought out investors to open additional WingHouse locations.",
"Burnett accepted investor Crawford Ker (a former National Football League player) to assist financing the expansion."
]
}
```
The average token counts for the documents and summaries are provided below:
| Feature | Mean Token Count |
| ---------- | ---------------- |
| Document | 1,612 |
| Summary | 40 |
### Data Fields
- `title`: a string, containing the original Wikipedia title.
- `document`: a list of sentences, containing the original content in the Wikipedia sections except the first abstract section.
- `aspect`: a string, containing the section name and its parent section names.
- `aspect_sents`: a list of indices, representing the sentences in the `aspect` section.
- `summary`: a list of sentences, the corresponding aspect-based summary for the document.
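Putting the fields together: the sentences of the aspect-specific source section can be recovered by indexing `document` with `aspect_sents`. A minimal sketch on a toy instance shaped like the example above (the text here is made up for brevity):

```python
instance = {
    "title": "Example Article",
    "document": ["Sentence zero.", "Sentence one.", "Sentence two.", "Sentence three."],
    "aspect": "History",
    "aspect_sents": [1, 2],
    "summary": ["A short aspect-based summary."],
}

# Select only the sentences belonging to the "History" section
aspect_text = " ".join(instance["document"][i] for i in instance["aspect_sents"])
print(aspect_text)  # Sentence one. Sentence two.
```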
### Data Splits
The OASum dataset has 3 splits: _train_, _valid_, and _test_. Below are the statistics for the Version 1.0.0 of the dataset.
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 3,523,986 |
| Validation | 111,578 |
| Test | 112,005 |
## Additional Information
### Licensing Information
The OASum Dataset version 1.0.0 is released under the [CC-BY-SA-3.0 License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)
### Citation Information
```
@article{yang2022oasum,
title={Oasum: Large-scale open domain aspect-based summarization},
author={Yang, Xianjun and Song, Kaiqiang and Cho, Sangwoo and Wang, Xiaoyang and Pan, Xiaoman and Petzold, Linda and Yu, Dong},
journal={arXiv preprint arXiv:2212.09233},
year={2022}
}
``` |