| datasetId | card |
|---|---|
faizalnf1800/scifi-webnovel | ---
license: mit
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/52a401f3 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1333
dataset_size: 178
---
# Dataset Card for "52a401f3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713099772 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22155
num_examples: 50
download_size: 13120
dataset_size: 22155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/diantha_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of diantha/ディアンサ (Granblue Fantasy)
This is the dataset of diantha/ディアンサ (Granblue Fantasy), containing 60 images and their tags.
The core tags of this character are `brown_hair, long_hair, breasts, side_ponytail, brown_eyes, hair_ornament, medium_breasts, ahoge, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 74.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 49.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 139 | 100.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 69.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 128.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/diantha_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/diantha_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some distinct outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, simple_background, smile, solo, open_mouth, bracelet, bangs, short_sleeves, white_background, blush, boots, full_body, idol, holding_microphone, short_dress |
| 1 | 11 |  |  |  |  |  | bikini, hair_flower, looking_at_viewer, smile, 1girl, cleavage, navel, bracelet, open_mouth, solo, skirt, blush, choker, large_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | simple_background | smile | solo | open_mouth | bracelet | bangs | short_sleeves | white_background | blush | boots | full_body | idol | holding_microphone | short_dress | bikini | hair_flower | navel | skirt | choker | large_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:--------------------|:--------|:-------|:-------------|:-----------|:--------|:----------------|:-------------------|:--------|:--------|:------------|:-------|:---------------------|:--------------|:---------|:--------------|:--------|:--------|:---------|:----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | X | X | X | X | | | | X | | | | | | X | X | X | X | X | X |
|
nullzero-live/midjourney-sentiment | ---
license: openrail
---
130k Midjourney prompts and their evaluated sentiment, computed with the `NLTK` library and the "Opinion Mining" positive/negative word lists:
https://www.cs.uic.edu/~liub/FBS/sentiment-analysis.html
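A minimal sketch of this lexicon-based approach, assuming a simple positive-minus-negative hit count (the tiny word lists below are illustrative stand-ins for the full opinion lexicon, and the scoring rule is an assumption, not necessarily the dataset's exact pipeline):

```python
# Toy stand-ins for the ~6800-word positive/negative opinion lexicon.
POSITIVE = {'beautiful', 'elegant', 'stunning', 'vibrant', 'masterpiece'}
NEGATIVE = {'ugly', 'grim', 'broken', 'bleak', 'horror'}

def lexicon_sentiment(prompt: str) -> str:
    """Classify a prompt by counting positive vs. negative lexicon hits."""
    tokens = [t.strip('.,!?') for t in prompt.lower().split()]
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return 'positive'
    if score < 0:
        return 'negative'
    return 'neutral'

print(lexicon_sentiment('a stunning vibrant portrait, masterpiece'))  # positive
print(lexicon_sentiment('grim bleak ruins'))                          # negative
```

In practice, NLTK's tokenizer would replace the bare `split()` call above.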
|
bigscience-data/roots_indic-gu_indic_nlp_corpus | ---
language: gu
license: cc-by-nc-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-gu_indic_nlp_corpus
# Indic NLP Corpus
- Dataset uid: `indic_nlp_corpus`
### Description
The IndicNLP corpus is a large-scale, general-domain corpus containing 2.7 billion words for 10 Indian languages from two language families (Indo-Aryan and Dravidian). Each language has at least 100 million words (except Oriya).
### Homepage
https://github.com/AI4Bharat/indicnlp_corpus#publicly-available-classification-datasets
### Licensing
- non-commercial use
- cc-by-nc-sa-4.0: Creative Commons Attribution Non Commercial Share Alike 4.0 International
### Speaker Locations
- Southern Asia
- India
### Sizes
- 3.4019 % of total
- 44.4368 % of indic-hi
- 64.2943 % of indic-ta
- 70.5374 % of indic-ml
- 54.2394 % of indic-te
- 55.9105 % of indic-kn
- 61.6111 % of indic-mr
- 67.2242 % of indic-pa
- 68.1470 % of indic-or
- 64.3879 % of indic-gu
- 4.1495 % of indic-bn
### BigScience processing steps
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: indic-gu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
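As a rough illustration of what three of these filters do, here is a hypothetical pure-Python sketch (assuming `dedup_document` keys on exact text hashes and `filter_small_docs_bytes_300` drops documents under 300 UTF-8 bytes; `dedup_template_soft` involves fuzzy template matching and is omitted):

```python
import hashlib

def dedup_document(docs):
    """Drop exact-duplicate documents, keyed on a hash of the text."""
    seen, out = set(), []
    for doc in docs:
        h = hashlib.sha256(doc.encode('utf-8')).hexdigest()
        if h not in seen:
            seen.add(h)
            out.append(doc)
    return out

def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only."""
    return [doc for doc in docs if doc.strip()]

def filter_small_docs_bytes_300(docs):
    """Drop documents smaller than 300 bytes when UTF-8 encoded."""
    return [doc for doc in docs if len(doc.encode('utf-8')) >= 300]

docs = ['', 'short doc', 'x' * 400, 'x' * 400]
for step in (dedup_document, filter_remove_empty_docs, filter_small_docs_bytes_300):
    docs = step(docs)
print(len(docs))  # 1 -- the duplicates collapse to one doc, the rest are filtered out
```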
|
Teklia/HOME-Alcar-line | ---
license: mit
language:
- la
task_categories:
- image-to-text
pretty_name: HOME-Alcar-line
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_examples: 59969
- name: validation
num_examples: 7905
- name: test
num_examples: 6932
dataset_size: 74806
tags:
- atr
- htr
- ocr
- historical
- handwritten
---
# HOME-Alcar - line level
## Table of Contents
- [HOME-Alcar - line level](#home-alcar-line-level)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
## Dataset Description
- **Homepage:** [HOME](https://www.heritageresearch-hub.eu/project/home/)
- **Source:** [Arkindex](https://demo.arkindex.org/browse/46b9b1f4-baeb-4342-a501-e2f15472a276?top_level=true&folder=true)
- **Point of Contact:** [TEKLIA](https://teklia.com)
## Dataset Summary
The HOME-Alcar (Aligned and Annotated Cartularies) dataset is a medieval corpus. Its 17 manuscripts are cartularies, i.e. books copying charters and legal acts, produced between the 12th and 14th centuries.
Note that all images are resized to a fixed height of 128 pixels.
### Languages
All the documents in the dataset are written in Latin.
## Dataset Structure
### Data Instances
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=4300x128 at 0x1A800E8E190>,
'text': 'quatre mille livres de tournoiz poiez, si com¬'
}
```
### Data Fields
- `image`: a `PIL.Image.Image` object containing the image. Note that when the image column is accessed (e.g. `dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files can take a significant amount of time, so always query the sample index before the `"image"` column: `dataset[0]["image"]` should be preferred over `dataset["image"][0]`.
- `text`: the label transcription of the image. |
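To see why this access order matters, here is a toy model of the lazy decoding behavior (a hypothetical stand-in for the `datasets` library, not its actual implementation): column access decodes every image in the split, while row access decodes exactly one.

```python
class LazyImageColumn:
    """Toy stand-in: decoding happens only when an image is actually accessed."""
    def __init__(self, n):
        self.n = n
        self.decoded = 0  # counts how many decode operations were triggered

    def row(self, i):
        # dataset[i]["image"]: decode exactly one image
        self.decoded += 1
        return f'image-{i}'

    def column(self):
        # dataset["image"]: decode every image in the split
        self.decoded += self.n
        return [f'image-{i}' for i in range(self.n)]

ds = LazyImageColumn(n=59969)  # size of the train split
first = ds.row(0)              # 1 decode
print(ds.decoded)              # 1
ds.column()[0]                 # same element, but 59969 decodes
print(ds.decoded)              # 59970
```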
swarnavoroopya/demo | ---
license: mit
---
|
open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-1 | ---
pretty_name: Evaluation run of Josephgflowers/Tinyllama-1.5B-Cinder-Test-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/Tinyllama-1.5B-Cinder-Test-1](https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-04T19:13:35.456648](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-1/blob/main/results_2024-04-04T19-13-35.456648.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2551872186742336,\n\
\ \"acc_stderr\": 0.030720839714441703,\n \"acc_norm\": 0.2564006611007971,\n\
\ \"acc_norm_stderr\": 0.03153950595899025,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4054758035159032,\n\
\ \"mc2_stderr\": 0.014782479763462388\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26791808873720135,\n \"acc_stderr\": 0.012942030195136435,\n\
\ \"acc_norm\": 0.31313993174061433,\n \"acc_norm_stderr\": 0.013552671543623497\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3657637920732922,\n\
\ \"acc_stderr\": 0.004806593424942259,\n \"acc_norm\": 0.4523999203345947,\n\
\ \"acc_norm_stderr\": 0.00496711857590529\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051996,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051996\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237653,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234085,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234085\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400158,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400158\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.0234009289183105,\n \
\ \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.0234009289183105\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926764,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926764\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882385,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882385\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.21076233183856502,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824848,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824848\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.02723601394619668,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.02723601394619668\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.25287356321839083,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.19653179190751446,\n \"acc_stderr\": 0.021393961404363847,\n\
\ \"acc_norm\": 0.19653179190751446,\n \"acc_norm_stderr\": 0.021393961404363847\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961452,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.023287685312334813,\n\
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.023287685312334813\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02378858355165853,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02378858355165853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335307,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02892058322067561,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02892058322067561\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.03096590312357305,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.03096590312357305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.4054758035159032,\n\
\ \"mc2_stderr\": 0.014782479763462388\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5706393054459353,\n \"acc_stderr\": 0.013911537499969165\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|arc:challenge|25_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|gsm8k|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hellaswag|10_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T19-13-35.456648.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-04T19-13-35.456648.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- '**/details_harness|winogrande|5_2024-04-04T19-13-35.456648.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-04T19-13-35.456648.parquet'
- config_name: results
data_files:
- split: 2024_04_04T19_13_35.456648
path:
- results_2024-04-04T19-13-35.456648.parquet
- split: latest
path:
- results_2024-04-04T19-13-35.456648.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.5B-Cinder-Test-1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.5B-Cinder-Test-1](https://huggingface.co/Josephgflowers/Tinyllama-1.5B-Cinder-Test-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-1",
"harness_winogrande_5",
	split="latest")
```
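Because the timestamped split names use a sortable `YYYY_MM_DDTHH_MM_SS` layout, the most recent run can also be recovered directly from the split names without the `latest` alias. A minimal sketch in pure Python (the second timestamp is hypothetical, added only for illustration):

```python
# Timestamped split names in this card sort lexicographically in
# chronological order, so max() over them picks the most recent run.
splits = [
    "2024_04_04T19_13_35.456648",  # the run in this card
    "2024_03_30T10_02_11.123456",  # hypothetical earlier run
    "latest",                      # alias split, excluded below
]

timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)  # 2024_04_04T19_13_35.456648
```

This is only a convenience for inspecting repositories with many runs; loading the `latest` split, as in the example above, is the intended access path.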
## Latest results
These are the [latest results from run 2024-04-04T19:13:35.456648](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.5B-Cinder-Test-1/blob/main/results_2024-04-04T19-13-35.456648.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" file and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.2551872186742336,
"acc_stderr": 0.030720839714441703,
"acc_norm": 0.2564006611007971,
"acc_norm_stderr": 0.03153950595899025,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4054758035159032,
"mc2_stderr": 0.014782479763462388
},
"harness|arc:challenge|25": {
"acc": 0.26791808873720135,
"acc_stderr": 0.012942030195136435,
"acc_norm": 0.31313993174061433,
"acc_norm_stderr": 0.013552671543623497
},
"harness|hellaswag|10": {
"acc": 0.3657637920732922,
"acc_stderr": 0.004806593424942259,
"acc_norm": 0.4523999203345947,
"acc_norm_stderr": 0.00496711857590529
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051996,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051996
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237653,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234085,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234085
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400158,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400158
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.0234009289183105,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.0234009289183105
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926764,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926764
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21076233183856502,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.21076233183856502,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824848,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824848
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02723601394619668,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02723601394619668
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.021393961404363847,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.021393961404363847
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961452,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.023287685312334813,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.023287685312334813
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02378858355165853,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02378858355165853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.02624492034984301,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.02624492034984301
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02892058322067561,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02892058322067561
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.03096590312357305,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.03096590312357305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.4054758035159032,
"mc2_stderr": 0.014782479763462388
},
"harness|winogrande|5": {
"acc": 0.5706393054459353,
"acc_stderr": 0.013911537499969165
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_rte_do_tense_marker | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 882262
num_examples: 2471
- name: train
num_bytes: 760770
num_examples: 2035
download_size: 1057393
dataset_size: 1643032
---
# Dataset Card for "MULTI_VALUE_rte_do_tense_marker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
silentzone/test | ---
license: apache-2.0
---
|
BirdL/ProjectSong | ---
license: apache-2.0
---
|
Jing24/low-train1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 70620698
num_examples: 77589
download_size: 44686816
dataset_size: 70620698
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "low-train1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SauravMaheshkar/pareto-citeseer | ---
size_categories:
- 1K<n<10K
task_categories:
- graph-ml
license: cc
---
## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:-------:|:----------:|
| 3,327 | 9,104 | 3,703 |
Pre-processed as per the official codebase of https://arxiv.org/abs/2210.02016
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
``` |
nq_open | ---
annotations_creators:
- expert-generated
language_creators:
- other
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|natural_questions
task_categories:
- question-answering
task_ids:
- open-domain-qa
pretty_name: NQ-Open
dataset_info:
config_name: nq_open
features:
- name: question
dtype: string
- name: answer
sequence: string
splits:
- name: train
num_bytes: 6651236
num_examples: 87925
- name: validation
num_bytes: 313829
num_examples: 3610
download_size: 4678245
dataset_size: 6965065
configs:
- config_name: nq_open
data_files:
- split: train
path: nq_open/train-*
- split: validation
path: nq_open/validation-*
default: true
---
# Dataset Card for nq_open
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://efficientqa.github.io/
- **Repository:** https://github.com/google-research-datasets/natural-questions/tree/master/nq_open
- **Paper:** https://www.aclweb.org/anthology/P19-1612.pdf
- **Leaderboard:** https://ai.google.com/research/NaturalQuestions/efficientqa
- **Point of Contact:** [Mailing List](mailto:efficientqa@googlegroups.com)
### Dataset Summary
The NQ-Open task, introduced by Lee et al. (2019),
is an open-domain question answering benchmark derived from Natural Questions.
The goal is to predict an English answer string for an input English question.
All questions can be answered using the contents of English Wikipedia.
### Supported Tasks and Leaderboards
Open-domain question answering.
EfficientQA Leaderboard: https://ai.google.com/research/NaturalQuestions/efficientqa
### Languages
English (`en`)
## Dataset Structure
### Data Instances
```
{
"question": "names of the metropolitan municipalities in south africa",
"answer": [
"Mangaung Metropolitan Municipality",
"Nelson Mandela Bay Metropolitan Municipality",
"eThekwini Metropolitan Municipality",
"City of Tshwane Metropolitan Municipality",
"City of Johannesburg Metropolitan Municipality",
"Buffalo City Metropolitan Municipality",
"City of Ekurhuleni Metropolitan Municipality"
]
}
```
### Data Fields
- `question` - Input open domain question.
- `answer` - List of possible answers to the question
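A minimal sketch of how a single example conforms to this schema (plain Python, no download needed; the validation helper is an illustration, not part of the dataset's API):

```python
# Validate that an NQ-Open example matches the documented schema:
# a string `question` and a list of string answers.

example = {
    "question": "names of the metropolitan municipalities in south africa",
    "answer": [
        "Mangaung Metropolitan Municipality",
        "Nelson Mandela Bay Metropolitan Municipality",
    ],
}

def matches_schema(ex):
    """Return True if `ex` has a string question and a list of string answers."""
    return (
        isinstance(ex.get("question"), str)
        and isinstance(ex.get("answer"), list)
        and all(isinstance(a, str) for a in ex["answer"])
    )
```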
### Data Splits
- Train: 87,925
- Validation: 3,610
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
Natural Questions contains questions from aggregated queries to Google Search (Kwiatkowski et al., 2019). To gather an open version of this dataset, we only keep questions with short answers and discard the given evidence document. Answers with many tokens often resemble extractive snippets rather than canonical answers, so we discard answers with more than 5 tokens.
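The answer-length filter described above can be sketched as follows (a plain whitespace split stands in for the actual tokenizer, which is an assumption; the original preprocessing is not specified here):

```python
# Sketch of the NQ-Open answer filter: keep only short answers,
# dropping any answer string longer than 5 tokens. Tokenization is
# approximated by whitespace splitting.

def keep_short_answers(answers, max_tokens=5):
    """Return the subset of answer strings with at most max_tokens tokens."""
    return [a for a in answers if len(a.split()) <= max_tokens]

candidates = [
    "Mangaung Metropolitan Municipality",  # 3 tokens -> kept
    "the seven metropolitan municipalities established under the Municipal Structures Act",  # 10 tokens -> dropped
]
filtered = keep_short_answers(candidates)
```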
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
Evaluating on this diverse set of question-answer pairs is crucial, because all existing datasets have inherent biases that are problematic for open-domain QA systems with learned retrieval.
In the Natural Questions dataset, the question askers do not already know the answer. This accurately reflects a distribution of genuine information-seeking questions.
However, annotators must separately find correct answers, which requires assistance from automatic tools and can introduce a moderate bias towards results from the tool.
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
All of the Natural Questions data is released under the
[CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/) license.
### Citation Information
```
@article{doi:10.1162/tacl\_a\_00276,
author = {Kwiatkowski, Tom and Palomaki, Jennimaria and Redfield, Olivia and Collins, Michael and Parikh, Ankur and Alberti, Chris and Epstein, Danielle and Polosukhin, Illia and Devlin, Jacob and Lee, Kenton and Toutanova, Kristina and Jones, Llion and Kelcey, Matthew and Chang, Ming-Wei and Dai, Andrew M. and Uszkoreit, Jakob and Le, Quoc and Petrov, Slav},
title = {Natural Questions: A Benchmark for Question Answering Research},
journal = {Transactions of the Association for Computational Linguistics},
volume = {7},
number = {},
pages = {453-466},
year = {2019},
doi = {10.1162/tacl\_a\_00276},
URL = {
https://doi.org/10.1162/tacl_a_00276
},
eprint = {
https://doi.org/10.1162/tacl_a_00276
},
abstract = { We present the Natural Questions corpus, a question answering data set. Questions consist of real anonymized, aggregated queries issued to the Google search engine. An annotator is presented with a question along with a Wikipedia page from the top 5 search results, and annotates a long answer (typically a paragraph) and a short answer (one or more entities) if present on the page, or marks null if no long/short answer is present. The public release consists of 307,373 training examples with single annotations; 7,830 examples with 5-way annotations for development data; and a further 7,842 examples with 5-way annotated sequestered as test data. We present experiments validating quality of the data. We also describe analysis of 25-way annotations on 302 examples, giving insights into human variability on the annotation task. We introduce robust metrics for the purposes of evaluating question answering systems; demonstrate high human upper bounds on these metrics; and establish baseline results using competitive methods drawn from related literature. }
}
@inproceedings{lee-etal-2019-latent,
title = "Latent Retrieval for Weakly Supervised Open Domain Question Answering",
author = "Lee, Kenton and
Chang, Ming-Wei and
Toutanova, Kristina",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1612",
doi = "10.18653/v1/P19-1612",
pages = "6086--6096",
abstract = "Recent work on open domain question answering (QA) assumes strong supervision of the supporting evidence and/or assumes a blackbox information retrieval (IR) system to retrieve evidence candidates. We argue that both are suboptimal, since gold evidence is not always available, and QA is fundamentally different from IR. We show for the first time that it is possible to jointly learn the retriever and reader from question-answer string pairs and without any IR system. In this setting, evidence retrieval from all of Wikipedia is treated as a latent variable. Since this is impractical to learn from scratch, we pre-train the retriever with an Inverse Cloze Task. We evaluate on open versions of five QA datasets. On datasets where the questioner already knows the answer, a traditional IR system such as BM25 is sufficient. On datasets where a user is genuinely seeking an answer, we show that learned retrieval is crucial, outperforming BM25 by up to 19 points in exact match.",
}
```
### Contributions
Thanks to [@Nilanshrajput](https://github.com/Nilanshrajput) for adding this dataset. |
liuyanchen1015/MULTI_VALUE_mrpc_it_dobj | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 16279
num_examples: 54
- name: train
num_bytes: 35723
num_examples: 121
- name: validation
num_bytes: 2358
num_examples: 8
download_size: 48398
dataset_size: 54360
---
# Dataset Card for "MULTI_VALUE_mrpc_it_dobj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-v0.1](https://huggingface.co/Undi95/Mistral-11B-v0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T00:55:47.571163](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1/blob/main/results_2023-12-30T00-55-47.571163.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6300139074610193,\n\
\ \"acc_stderr\": 0.03239200090048791,\n \"acc_norm\": 0.6378790325146357,\n\
\ \"acc_norm_stderr\": 0.03306276365916844,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.4066832234739293,\n\
\ \"mc2_stderr\": 0.014223545486867587\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348902,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6128261302529376,\n\
\ \"acc_stderr\": 0.004861084534087025,\n \"acc_norm\": 0.8116908982274448,\n\
\ \"acc_norm_stderr\": 0.0039015979142464933\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266854,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266854\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30502793296089387,\n\
\ \"acc_stderr\": 0.015398723510916715,\n \"acc_norm\": 0.30502793296089387,\n\
\ \"acc_norm_stderr\": 0.015398723510916715\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4322033898305085,\n \"acc_stderr\": 0.012652297777114968,\n\
\ \"acc_norm\": 0.4322033898305085,\n \"acc_norm_stderr\": 0.012652297777114968\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545436,\n \"\
acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545436\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.4066832234739293,\n\
\ \"mc2_stderr\": 0.014223545486867587\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.266868840030326,\n \
\ \"acc_stderr\": 0.012183780551887955\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|arc:challenge|25_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|gsm8k|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hellaswag|10_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T00-55-47.571163.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- '**/details_harness|winogrande|5_2023-12-30T00-55-47.571163.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T00-55-47.571163.parquet'
- config_name: results
data_files:
- split: 2023_12_30T00_55_47.571163
path:
- results_2023-12-30T00-55-47.571163.parquet
- split: latest
path:
- results_2023-12-30T00-55-47.571163.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-v0.1](https://huggingface.co/Undi95/Mistral-11B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T00:55:47.571163](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1/blob/main/results_2023-12-30T00-55-47.571163.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6300139074610193,
"acc_stderr": 0.03239200090048791,
"acc_norm": 0.6378790325146357,
"acc_norm_stderr": 0.03306276365916844,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.4066832234739293,
"mc2_stderr": 0.014223545486867587
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348902,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6128261302529376,
"acc_stderr": 0.004861084534087025,
"acc_norm": 0.8116908982274448,
"acc_norm_stderr": 0.0039015979142464933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266854,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266854
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30502793296089387,
"acc_stderr": 0.015398723510916715,
"acc_norm": 0.30502793296089387,
"acc_norm_stderr": 0.015398723510916715
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545436,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545436
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.4066832234739293,
"mc2_stderr": 0.014223545486867587
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.266868840030326,
"acc_stderr": 0.012183780551887955
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ppbrown/faeryqueen | ---
license: creativeml-openrail-m
---

This contains all the files used to create my ["faeryqueen" LoRA](https://civitai.com/models/381785/faeryqueen-sd) with OneTrainer
|
AlexDom/TSA | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: 'null'
- name: metadata
struct:
- name: category
dtype: int64
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 1205760
num_examples: 5001
download_size: 447577
dataset_size: 1205760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aagoluoglu/AI_HW3_detection_results | ---
dataset_info:
features:
- name: video_id
dtype: string
- name: frame_num
dtype: int64
- name: timestamp
dtype: float64
- name: detected_obj_id
dtype: int64
- name: detected_obj_class
dtype: int64
- name: confidence
dtype: float32
- name: bbox_info
sequence: float32
splits:
- name: train
num_bytes: 120445
num_examples: 1111
download_size: 46643
dataset_size: 120445
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_analytic_superlative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 225
num_examples: 1
- name: train
num_bytes: 5677
num_examples: 17
download_size: 7814
dataset_size: 5902
---
# Dataset Card for "MULTI_VALUE_wnli_analytic_superlative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
projecte-aina/raco_forums | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- ca
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: Racó Forums
task_categories:
- fill-mask
task_ids: []
---
# Dataset Card for Racó Forums Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Point of Contact:** [langtech@bsc.es](mailto:langtech@bsc.es)
### Dataset Summary
The Racó Forums Corpus is a 19-million-sentence corpus of Catalan user-generated text built from the forums of [Racó Català](https://www.racocatala.cat/forums).
Since the existing available corpora in Catalan lacked conversational data, we searched for a major source of such data for Catalan, and we found Racó Català, a popular multitopic online forum. We obtained a database dump and we transformed all the threads so that we obtained documents that traverse all the existing paths from the root (initial comment) to the leaves (last comment with no reply). In other words, if T is a tree such that T = {A,B,C,D}, where the first comment A is replied to by B and C independently, and C is then replied to by D, we obtain two different documents, A,B and A,C,D, in the fairseq language-modeling format.
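The root-to-leaf expansion described above can be sketched as follows (a minimal illustration with hypothetical comment labels, not the actual extraction code):

```python
# Sketch: enumerate every root-to-leaf path in a comment tree,
# mirroring how each forum thread is turned into documents.
def thread_to_documents(replies, root):
    """replies maps a comment to the list of comments that answer it."""
    paths = []

    def walk(node, path):
        path = path + [node]
        children = replies.get(node, [])
        if not children:  # leaf: one finished document
            paths.append(path)
        for child in children:
            walk(child, path)

    walk(root, [])
    return paths

# Example from the card: A is answered by B and C; D answers C.
replies = {"A": ["B", "C"], "C": ["D"]}
print(thread_to_documents(replies, "A"))  # [['A', 'B'], ['A', 'C', 'D']]
```

Each resulting path is then serialized as one document, so shared ancestors (like A above) appear in several documents.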
This work is licensed under a [Creative Commons Attribution Non-commercial 4.0 International License](https://creativecommons.org/licenses/by-nc/4.0/).
### Supported Tasks and Leaderboards
This corpus is mainly intended to pretrain language models and word representations.
### Languages
The dataset is in Catalan (`ca-ES`).
## Dataset Structure
The sentences are ordered to preserve the forum structure of comments and answers. If T is a tree such that T = {A,B,C,D}, where the first comment A is replied to by B and C independently, and C is then replied to by D, we obtain two different documents, A,B and A,C,D, in the fairseq language-modeling format.
### Data Instances
```
Ni la Paloma, ni la Razz, ni Bikini, ni res: la cafeteria Slàvia, a Les borges Blanques. Quin concertàs el d'ahir de Pomada!!! Fuà!!! va ser tan tan tan tan tan tan tan bo!!! Flipant!!! Irrepetible!!
És cert, l'Slàvia mola màxim.
```
### Data Splits
The dataset contains two splits: `train` and `valid`.
## Dataset Creation
### Curation Rationale
We created this corpus to contribute to the development of language models in Catalan, a low-resource language. The data was structured to preserve the dialogue structure of forums.
### Source Data
#### Initial Data Collection and Normalization
The data was structured and anonymized by the BSC.
#### Who are the source language producers?
The data was provided by Racó Català.
### Annotations
The dataset is unannotated.
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
The data was anonymized to remove user names and emails, which were changed to random Catalan names. Mentions of the forum itself have also been changed.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this corpus contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
We are aware that, since the data comes from user-generated forums, this will contain biases, hate speech and toxic content. We have not applied any steps to reduce their impact.
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@bsc.es).
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/ca/inici/index.html#googtrans(ca|en) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina).
### Licensing Information
This work is licensed under a [Creative Commons Attribution Non-commercial 4.0 International License](https://creativecommons.org/licenses/by-nc/4.0/).
### Citation Information
```
```
### Contributions
Thanks to Racó Català for sharing their data.
|
Densu341/Fresh-rotten-fruit | ---
license: openrail
---
|
adamjweintraut/bart-finetuned-kwsylgen-64_2024-04-12_run | ---
dataset_info:
features:
- name: id
dtype: string
- name: orig
dtype: string
- name: predicted
dtype: string
- name: label
dtype: string
- name: rougeL_min_precision
dtype: float64
- name: rougeL_min_recall
dtype: float64
- name: rougeL_min_fmeasure
dtype: float64
- name: rougeL_median_precision
dtype: float64
- name: rougeL_median_recall
dtype: float64
- name: rougeL_median_fmeasure
dtype: float64
- name: rougeL_max_precision
dtype: float64
- name: rougeL_max_recall
dtype: float64
- name: rougeL_max_fmeasure
dtype: float64
- name: predicted_label_sim
dtype: float32
- name: predicted_syls
dtype: int64
- name: label_syls
dtype: int64
- name: syl_error
dtype: float64
splits:
- name: train
num_bytes: 7204
num_examples: 15
download_size: 13186
dataset_size: 7204
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dongyoung4091/hh-generated_flan_t5_large_with_features2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: 'biased:'
dtype: int64
- name: easy-to-understand
dtype: int64
- name: enough-detail
dtype: int64
- name: factuality
dtype: int64
- name: fail-to-consider-context
dtype: int64
- name: fail-to-consider-individual-preferences
dtype: int64
- name: helpfulness
dtype: int64
- name: intent
dtype: int64
- name: readability
dtype: int64
- name: relevance
dtype: int64
- name: repetetive
dtype: int64
- name: specificity
dtype: int64
- name: too-long
dtype: int64
splits:
- name: train
num_bytes: 395323
num_examples: 1600
download_size: 76218
dataset_size: 395323
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VladS159/common_voice_16_1_romanian_speech_synthesis | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2155457630.946
num_examples: 34703
- name: test
num_bytes: 279470458.146
num_examples: 4438
download_size: 2366238354
dataset_size: 2434928089.092
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
FanChen0116/bus_few4_80x | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 1087354
num_examples: 5600
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 1164872
---
# Dataset Card for "bus_few4_80x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
comet-team/coco-500 | ---
dataset_info:
features:
- name: row-id
dtype: int32
- name: ID
dtype: int32
- name: Image
dtype: image
- name: Score
dtype: float32
- name: Confidence
dtype: float32
- name: Filename
dtype: string
- name: Category 5
dtype: string
- name: Category 10
dtype: string
- name: Image--metadata
dtype: large_string
splits:
- name: train
num_bytes: 247000470.0
num_examples: 500
download_size: 246448541
dataset_size: 247000470.0
---
# Dataset Card for "coco-500"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
artemkramov/coreference-dataset-ua | ---
task_categories:
- token-classification
language:
- uk
pretty_name: 'Silver Ukrainian Coreference Dataset'
tags:
- coreference-resolution
- coreference
- anaphora
size_categories:
- 10K<n<100K
---
# Silver Ukrainian Coreference Dataset
## Dataset Description
### Dataset Summary
A silver coreference resolution dataset for the Ukrainian language. The dataset was generated automatically by applying a word-alignment method to the following English dataset: https://github.com/d5555/Coreference-dataset.
The word alignment method was implemented by Andrii Kursin (aqrsn@ukr.net).
### Languages
- Ukrainian
## Dataset Structure
### Data Fields
Each sample of the dataset consists of the following fields:
- **doc_key** - document identifier.
- **clusters** - list of clusters, where each cluster consists of the list of mentions. Each mention is represented as a list of two indices: the first index denotes the first word of the mention, the second index denotes the last word of the mention.
- **sentences** - list of sentences where each sentence is represented as a list of words.
- **tokens** - list of words.
- **speakers** - list of speakers which is currently filled with dummy input.
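As a minimal illustration of the span convention (using a made-up sample, not actual dataset content), mention strings can be recovered from the cluster indices like this:

```python
# Each mention is [start, end] word indices into the flat token list,
# with the end index inclusive (it points at the last word of the mention).
sample = {
    "tokens": ["Artem", "wrote", "the", "code", ";", "he", "shared", "it", "."],
    "clusters": [[[0, 0], [5, 5]], [[2, 3], [7, 7]]],
}

def cluster_texts(sample):
    toks = sample["tokens"]
    return [
        [" ".join(toks[start : end + 1]) for start, end in cluster]
        for cluster in sample["clusters"]
    ]

print(cluster_texts(sample))
# [['Artem', 'he'], ['the code', 'it']]
```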
### Data Splits
The dataset is divided into two parts:
- training set;
- validation set.
A test set is absent because the dataset was generated automatically.
## Dataset Creation
### Source Data
The dataset was created from the following dataset: https://github.com/d5555/Coreference-dataset.
### Contributions
The code for translating the samples and aligning the words was created by Andrii Kursin (aqrsn@ukr.net). The dataset was generated by Artem Kramov (https://www.linkedin.com/in/artem-kramov-0b3731100/). |
qkrwnstj/impressionism-journal | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 70265770.0
num_examples: 20
download_size: 70270244
dataset_size: 70265770.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "impressionism-journal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser | ---
pretty_name: Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T21:09:52.023664](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser/blob/main/results_2024-02-01T21-09-52.023664.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26487177878693896,\n\
\ \"acc_stderr\": 0.031083173918083885,\n \"acc_norm\": 0.26611351733798344,\n\
\ \"acc_norm_stderr\": 0.0318546335977903,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023507,\n \"mc2\": 0.36332154287207935,\n\
\ \"mc2_stderr\": 0.014014442507659016\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3097269624573379,\n \"acc_stderr\": 0.013512058415238361,\n\
\ \"acc_norm\": 0.33361774744027306,\n \"acc_norm_stderr\": 0.013778687054176538\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45140410276837284,\n\
\ \"acc_stderr\": 0.004966158142645414,\n \"acc_norm\": 0.5853415654252141,\n\
\ \"acc_norm_stderr\": 0.004916561213591292\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051975,\n\
\ \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051975\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.18055555555555555,\n\
\ \"acc_stderr\": 0.03216600808802269,\n \"acc_norm\": 0.18055555555555555,\n\
\ \"acc_norm_stderr\": 0.03216600808802269\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238153,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238153\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462833,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27339449541284405,\n \"acc_stderr\": 0.01910929984609828,\n \"\
acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.01910929984609828\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658335,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824849,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824849\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n\
\ \"acc_stderr\": 0.015133383278988825,\n \"acc_norm\": 0.23371647509578544,\n\
\ \"acc_norm_stderr\": 0.015133383278988825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596729,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596729\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.0251237392268724,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.0251237392268724\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2607561929595828,\n\
\ \"acc_stderr\": 0.011213471559602325,\n \"acc_norm\": 0.2607561929595828,\n\
\ \"acc_norm_stderr\": 0.011213471559602325\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28594771241830064,\n \"acc_stderr\": 0.01828048507295467,\n \
\ \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.01828048507295467\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.023661699177098615,\n\
\ \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 0.023661699177098615\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023507,\n \"mc2\": 0.36332154287207935,\n\
\ \"mc2_stderr\": 0.014014442507659016\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873836\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \
\ \"acc_stderr\": 0.003106901266499662\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|arc:challenge|25_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|gsm8k|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hellaswag|10_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T21-09-52.023664.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- '**/details_harness|winogrande|5_2024-02-01T21-09-52.023664.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T21-09-52.023664.parquet'
- config_name: results
data_files:
- split: 2024_02_01T21_09_52.023664
path:
- results_2024-02-01T21-09-52.023664.parquet
- split: latest
path:
- results_2024-02-01T21-09-52.023664.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser](https://huggingface.co/cognitivecomputations/TinyDolphin-2.8.2-1.1b-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-01T21:09:52.023664](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__TinyDolphin-2.8.2-1.1b-laser/blob/main/results_2024-02-01T21-09-52.023664.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; each task is available in its own configuration, under both the timestamped split and the "latest" split):
```python
{
"all": {
"acc": 0.26487177878693896,
"acc_stderr": 0.031083173918083885,
"acc_norm": 0.26611351733798344,
"acc_norm_stderr": 0.0318546335977903,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023507,
"mc2": 0.36332154287207935,
"mc2_stderr": 0.014014442507659016
},
"harness|arc:challenge|25": {
"acc": 0.3097269624573379,
"acc_stderr": 0.013512058415238361,
"acc_norm": 0.33361774744027306,
"acc_norm_stderr": 0.013778687054176538
},
"harness|hellaswag|10": {
"acc": 0.45140410276837284,
"acc_stderr": 0.004966158142645414,
"acc_norm": 0.5853415654252141,
"acc_norm_stderr": 0.004916561213591292
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051975,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051975
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.03216600808802269,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.03216600808802269
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238153,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238153
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462833,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.01910929984609828,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.01910929984609828
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658335,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824849,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824849
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23371647509578544,
"acc_stderr": 0.015133383278988825,
"acc_norm": 0.23371647509578544,
"acc_norm_stderr": 0.015133383278988825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.0251237392268724,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.0251237392268724
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2607561929595828,
"acc_stderr": 0.011213471559602325,
"acc_norm": 0.2607561929595828,
"acc_norm_stderr": 0.011213471559602325
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28594771241830064,
"acc_stderr": 0.01828048507295467,
"acc_norm": 0.28594771241830064,
"acc_norm_stderr": 0.01828048507295467
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.16326530612244897,
"acc_stderr": 0.023661699177098615,
"acc_norm": 0.16326530612244897,
"acc_norm_stderr": 0.023661699177098615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023507,
"mc2": 0.36332154287207935,
"mc2_stderr": 0.014014442507659016
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.013760357176873836
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499662
}
}
```
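The per-task metrics above are plain nested dicts, so they can be post-processed directly once loaded. A minimal sketch (using a hand-copied excerpt of the scores shown in this card, so the dict below is illustrative rather than the full file) that ranks a few MMLU subtasks by accuracy:

```python
# Rank a few MMLU subtasks from the results JSON above by accuracy.
# The dict is a hand-copied excerpt of the "latest" results shown in this card.
scores = {
    "harness|hendrycksTest-college_computer_science|5": 0.36,
    "harness|hendrycksTest-international_law|5": 0.34710743801652894,
    "harness|hendrycksTest-security_studies|5": 0.16326530612244897,
    "harness|hendrycksTest-us_foreign_policy|5": 0.16,
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    # Strip the harness prefix and few-shot suffix for readability.
    name = task.split("|")[1].removeprefix("hendrycksTest-")
    print(f"{name}: {acc:.3f}")
```

The same pattern applies to the full JSON file once it is downloaded from the repo.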
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
myorder/products-images-105k | ---
license: cc-by-sa-3.0
---
|
RachidAb02/Finance-Accounting | ---
license: mit
task_categories:
- question-answering
language:
- aa
tags:
- finance
pretty_name: Finance-Accounting
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
niaodtianatng/asdfghjkl | ---
license: apache-2.0
---
|
CyberHarem/akafuyu_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akafuyu/アカフユ/赤冬 (Arknights)
This is the dataset of akafuyu/アカフユ/赤冬 (Arknights), containing 43 images and their tags.
The core tags of this character are `long_hair, breasts, ponytail, yellow_eyes, multicolored_hair, streaked_hair, hair_between_eyes, red_hair, purple_hair, large_breasts, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 77.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akafuyu_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 43 | 63.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akafuyu_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 108 | 132.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akafuyu_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akafuyu_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
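The `IMG+TXT` packages listed above are plain zips; based on the package table, each image is presumed to sit next to a same-named `.txt` file holding its tags (this pairing convention is an assumption, not documented here). A minimal sketch for collecting those pairs from an extracted directory:

```python
import os

# Common image extensions expected in these packages (assumed).
IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}


def load_img_txt_pairs(dataset_dir):
    """Collect (image_path, tag_string) pairs from an extracted IMG+TXT package."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.isfile(txt_path):  # skip images without a tag file
            with open(txt_path, encoding='utf-8') as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs
```

For example, after downloading and extracting `dataset-1200.zip` the same way as the raw archive above, `load_img_txt_pairs('dataset_dir')` yields each image path with its tag string.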
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_shirt, looking_at_viewer, solo, blue_hair, crop_top, holding_sword, midriff, mole_under_mouth, simple_background, smile, white_background, katana, navel, sleeveless_shirt, upper_body, bare_shoulders, black_gloves, blush, fingerless_gloves, shoulder_armor, single_glove, stomach, very_long_hair |
| 1 | 5 |  |  |  |  |  | 1girl, crop_top, holding_sword, katana, shoulder_armor, solo, black_shirt, looking_at_viewer, midriff, blue_hair, navel, very_long_hair, black_gloves, grey_hair, japanese_armor, sheathed, single_glove, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_shirt | looking_at_viewer | solo | blue_hair | crop_top | holding_sword | midriff | mole_under_mouth | simple_background | smile | white_background | katana | navel | sleeveless_shirt | upper_body | bare_shoulders | black_gloves | blush | fingerless_gloves | shoulder_armor | single_glove | stomach | very_long_hair | grey_hair | japanese_armor | sheathed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------------|:-------|:------------|:-----------|:----------------|:----------|:-------------------|:--------------------|:--------|:-------------------|:---------|:--------|:-------------------|:-------------|:-----------------|:---------------|:--------|:--------------------|:-----------------|:---------------|:----------|:-----------------|:------------|:-----------------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | X | X | | | | X | | | X | X | | X | X | X | X |
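Each cluster row above is just a set of co-occurring tags. A minimal sketch for checking whether an image's tags cover a cluster's tag set (assuming tags are available as a collection of tag names, e.g. the keys of `item.meta['tags']` from the waifuc loading snippet):

```python
def matches_cluster(image_tags, cluster_tags):
    """True if every tag of the cluster appears among the image's tags."""
    return set(cluster_tags).issubset(set(image_tags))


# A few tags from cluster 1 above (subset chosen for illustration).
cluster_1 = ['1girl', 'crop_top', 'holding_sword', 'katana', 'shoulder_armor']
```

This can be used to split the raw package into per-outfit subsets before training.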
|
ShoukanLabs/OpenNiji-32238_65000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: prompt
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 52962622682.488
num_examples: 32759
download_size: 18174175565
dataset_size: 52962622682.488
---
# Dataset Card for "OpenNiji-32238_65000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangwang825/sst2-textbugger-1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: augment
dtype: string
splits:
- name: train
num_bytes: 1770881
num_examples: 13840
- name: validation
num_bytes: 110096
num_examples: 872
- name: test
num_bytes: 226340
num_examples: 1821
download_size: 916607
dataset_size: 2107317
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Vinnyyw/Ponchovoz | ---
license: openrail
---
|
open-llm-leaderboard/details_johnsnowlabs__PhigRange-DPO | ---
pretty_name: Evaluation run of johnsnowlabs/PhigRange-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [johnsnowlabs/PhigRange-DPO](https://huggingface.co/johnsnowlabs/PhigRange-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_johnsnowlabs__PhigRange-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T23:26:36.639397](https://huggingface.co/datasets/open-llm-leaderboard/details_johnsnowlabs__PhigRange-DPO/blob/main/results_2024-04-09T23-26-36.639397.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25440582121622896,\n\
\ \"acc_stderr\": 0.030864421919777126,\n \"acc_norm\": 0.2552550665065153,\n\
\ \"acc_norm_stderr\": 0.03168576752429294,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.4797537392660647,\n\
\ \"mc2_stderr\": 0.016660324054891092\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2150170648464164,\n \"acc_stderr\": 0.012005717634133611,\n\
\ \"acc_norm\": 0.257679180887372,\n \"acc_norm_stderr\": 0.012780770562768422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25562636924915355,\n\
\ \"acc_stderr\": 0.004353212146198441,\n \"acc_norm\": 0.2570205138418642,\n\
\ \"acc_norm_stderr\": 0.004360977256058753\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.02560423347089909,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.02560423347089909\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.16170212765957448,\n \"acc_stderr\": 0.024068505289695313,\n\
\ \"acc_norm\": 0.16170212765957448,\n \"acc_norm_stderr\": 0.024068505289695313\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.17543859649122806,\n\
\ \"acc_stderr\": 0.0357795481394837,\n \"acc_norm\": 0.17543859649122806,\n\
\ \"acc_norm_stderr\": 0.0357795481394837\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185555,\n\
\ \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185555\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398203,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398203\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n\
\ \"acc_stderr\": 0.025736542745594525,\n \"acc_norm\": 0.2870967741935484,\n\
\ \"acc_norm_stderr\": 0.025736542745594525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30808080808080807,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.30808080808080807,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29533678756476683,\n \"acc_stderr\": 0.032922966391551386,\n\
\ \"acc_norm\": 0.29533678756476683,\n \"acc_norm_stderr\": 0.032922966391551386\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.023000628243687964,\n\
\ \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.023000628243687964\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\"\
: 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.31862745098039214,\n\
\ \"acc_stderr\": 0.032702871814820816,\n \"acc_norm\": 0.31862745098039214,\n\
\ \"acc_norm_stderr\": 0.032702871814820816\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036406,\n\
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036406\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.16143497757847533,\n\
\ \"acc_stderr\": 0.024693957899128472,\n \"acc_norm\": 0.16143497757847533,\n\
\ \"acc_norm_stderr\": 0.024693957899128472\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952685,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952685\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.027236013946196676,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.027236013946196676\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653696,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653696\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395967,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395967\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468673,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468673\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20261437908496732,\n \"acc_stderr\": 0.02301544687798567,\n\
\ \"acc_norm\": 0.20261437908496732,\n \"acc_norm_stderr\": 0.02301544687798567\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n\
\ \"acc_stderr\": 0.024406162094668903,\n \"acc_norm\": 0.24437299035369775,\n\
\ \"acc_norm_stderr\": 0.024406162094668903\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2522816166883963,\n\
\ \"acc_stderr\": 0.011092789056875236,\n \"acc_norm\": 0.2522816166883963,\n\
\ \"acc_norm_stderr\": 0.011092789056875236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2612244897959184,\n \"acc_stderr\": 0.02812342933514279,\n\
\ \"acc_norm\": 0.2612244897959184,\n \"acc_norm_stderr\": 0.02812342933514279\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987251,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987251\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.4797537392660647,\n\
\ \"mc2_stderr\": 0.016660324054891092\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616445\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/johnsnowlabs/PhigRange-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-26-36.639397.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-26-36.639397.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- '**/details_harness|winogrande|5_2024-04-09T23-26-36.639397.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T23-26-36.639397.parquet'
- config_name: results
data_files:
- split: 2024_04_09T23_26_36.639397
path:
- results_2024-04-09T23-26-36.639397.parquet
- split: latest
path:
- results_2024-04-09T23-26-36.639397.parquet
---
# Dataset Card for Evaluation run of johnsnowlabs/PhigRange-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [johnsnowlabs/PhigRange-DPO](https://huggingface.co/johnsnowlabs/PhigRange-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_johnsnowlabs__PhigRange-DPO",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-09T23:26:36.639397](https://huggingface.co/datasets/open-llm-leaderboard/details_johnsnowlabs__PhigRange-DPO/blob/main/results_2024-04-09T23-26-36.639397.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25440582121622896,
"acc_stderr": 0.030864421919777126,
"acc_norm": 0.2552550665065153,
"acc_norm_stderr": 0.03168576752429294,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.4797537392660647,
"mc2_stderr": 0.016660324054891092
},
"harness|arc:challenge|25": {
"acc": 0.2150170648464164,
"acc_stderr": 0.012005717634133611,
"acc_norm": 0.257679180887372,
"acc_norm_stderr": 0.012780770562768422
},
"harness|hellaswag|10": {
"acc": 0.25562636924915355,
"acc_stderr": 0.004353212146198441,
"acc_norm": 0.2570205138418642,
"acc_norm_stderr": 0.004360977256058753
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.02560423347089909,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.02560423347089909
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.16170212765957448,
"acc_stderr": 0.024068505289695313,
"acc_norm": 0.16170212765957448,
"acc_norm_stderr": 0.024068505289695313
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.0357795481394837,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.0357795481394837
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398203,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594525,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30808080808080807,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.30808080808080807,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29533678756476683,
"acc_stderr": 0.032922966391551386,
"acc_norm": 0.29533678756476683,
"acc_norm_stderr": 0.032922966391551386
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.023000628243687964,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.023000628243687964
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.032702871814820816,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.032702871814820816
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036406,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036406
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.16143497757847533,
"acc_stderr": 0.024693957899128472,
"acc_norm": 0.16143497757847533,
"acc_norm_stderr": 0.024693957899128472
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952685,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952685
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.027236013946196676,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.027236013946196676
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653696,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653696
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395967,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395967
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468673,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468673
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20261437908496732,
"acc_stderr": 0.02301544687798567,
"acc_norm": 0.20261437908496732,
"acc_norm_stderr": 0.02301544687798567
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668903,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668903
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2522816166883963,
"acc_stderr": 0.011092789056875236,
"acc_norm": 0.2522816166883963,
"acc_norm_stderr": 0.011092789056875236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2612244897959184,
"acc_stderr": 0.02812342933514279,
"acc_norm": 0.2612244897959184,
"acc_norm_stderr": 0.02812342933514279
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987251,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987251
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.4797537392660647,
"mc2_stderr": 0.016660324054891092
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616445
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lionelchg/dolly15k_special_tokens | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 23658852.352275
num_examples: 14260
- name: test
num_bytes: 1245988.6477250017
num_examples: 751
download_size: 15124192
dataset_size: 24904841.0
---
# Dataset Card for "dolly15k_special_tokens"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
helloelwin/harmful-weak-labels | ---
dataset_info:
- config_name: default
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 17965022
num_examples: 10619
download_size: 8557596
dataset_size: 17965022
- config_name: gemma
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 46220058
num_examples: 21237
download_size: 21445744
dataset_size: 46220058
- config_name: gemma-e=2
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 21836006
num_examples: 10000
download_size: 10187582
dataset_size: 21836006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: gemma
data_files:
- split: train
path: gemma/train-*
- config_name: gemma-e=2
data_files:
- split: train
path: gemma-e=2/train-*
---
|
naphatmanu/index-loft-modern | ---
license: mit
---
|
CyberHarem/soline_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of soline/ソリン/索林/솔린 (Nikke: Goddess of Victory)
This is the dataset of soline/ソリン/索林/솔린 (Nikke: Goddess of Victory), containing 31 images and their tags.
The core tags of this character are `long_hair, bangs, hair_ornament, hat, hairclip, bow, very_long_hair, red_eyes, black_headwear, grey_hair, blue_bow, white_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 52.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soline_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 25.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soline_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 74 | 56.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soline_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 43.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soline_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 74 | 85.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soline_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/soline_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag-clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 31 |  |  |  |  |  | 1girl, looking_at_viewer, solo, closed_mouth, black_gloves, black_jacket, blush, long_sleeves, pleated_skirt, white_shirt, armband, black_pantyhose, black_skirt, bowtie, white_background, open_clothes, standing, holding_gun, shoes, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | closed_mouth | black_gloves | black_jacket | blush | long_sleeves | pleated_skirt | white_shirt | armband | black_pantyhose | black_skirt | bowtie | white_background | open_clothes | standing | holding_gun | shoes | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:---------------|:---------------|:--------|:---------------|:----------------|:--------------|:----------|:------------------|:--------------|:---------|:-------------------|:---------------|:-----------|:--------------|:--------|:--------------------|
| 0 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Tiax/demo | ---
license: apache-2.0
---
|
Alexisnlxoekdk/MCKevinV2 | ---
license: openrail
---
|
liuyanchen1015/VALUE_wikitext2_negative_concord | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 165495
num_examples: 178
- name: train
num_bytes: 1546197
num_examples: 1691
- name: validation
num_bytes: 152679
num_examples: 173
download_size: 1160295
dataset_size: 1864371
---
# Dataset Card for "VALUE_wikitext2_negative_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713160261 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11083
num_examples: 27
download_size: 15160
dataset_size: 11083
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713160261"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edg3/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
tyzhu/wikitext-103-raw-v1-para-permute-5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3279005674
num_examples: 10808095
- name: validation
num_bytes: 1159288
num_examples: 3760
- name: test
num_bytes: 1305088
num_examples: 4358
download_size: 1887425635
dataset_size: 3281470050
---
# Dataset Card for "wikitext-103-raw-v1-para-permute-5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Viet-Medical/medical_bench_raw | ---
dataset_info:
features:
- name: questions
dtype: string
- name: a
dtype: string
- name: b
dtype: string
- name: c
dtype: string
- name: d
dtype: string
- name: source_link
dtype: string
- name: correct_answer
dtype: string
splits:
- name: train
num_bytes: 11883977
num_examples: 31272
download_size: 1203689
dataset_size: 11883977
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AgentWaller/openassistant-guanaco-en-translated | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: int64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 32780237
num_examples: 29329
- name: validation
num_bytes: 1724911
num_examples: 1536
download_size: 13607387
dataset_size: 34505148
---
|
juancopi81/orca-math-word-problems-190038_200035 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 14487597
num_examples: 9997
download_size: 6518354
dataset_size: 14487597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kizik/Pythonista | ---
license: unknown
---
|
guiifive/fivevoz2 | ---
license: openrail
---
|
elissilva/sheldoncooper | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713199995 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17288
num_examples: 48
download_size: 17257
dataset_size: 17288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713199995"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pinzhenchen/alpaca-cleaned-fr | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- question-answering
language:
- fr
tags:
- instruction tuning
size_categories:
- 10K<n<100K
---
### Data Description
This HF data repository contains the French Alpaca dataset used in our study of monolingual versus multilingual instruction tuning.
* [GitHub](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main)
* [Paper](https://arxiv.org/abs/2309.08958)
#### Creation
* Machine-translated from [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) into French.
#### Usage
* This data is intended to be used for French instruction tuning.
* The dataset has roughly 52K instances in the JSON format.
* Each instance has an instruction, an output, and an optional input. An example is shown below:
```
{
"instruction": "Quelles sont les trois couleurs primaires ?",
"input": "",
"output": "Les trois couleurs primaires sont le rouge, le bleu et le jaune. Ces couleurs sont appelées primaires car elles ne peuvent pas être créées en mélangeant d'autres couleurs et toutes les autres couleurs peuvent être faites en les combinant dans différentes proportions. Dans le système de couleur additif, utilisé pour la lumière, les couleurs primaires sont le rouge, le vert et le bleu (RGB)."
}
```
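A common way to consume such instances for instruction tuning is to render each one into a single prompt string. The Alpaca-style template below is an assumption for illustration; this dataset does not prescribe a particular template:

```python
def format_instance(example: dict) -> str:
    """Render one instance into an Alpaca-style training prompt.

    The template is a common convention for Alpaca-derived data, not
    something defined by this dataset; adapt it to whatever format
    your training code expects.
    """
    if example.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )


example = {
    "instruction": "Quelles sont les trois couleurs primaires ?",
    "input": "",
    "output": "Les trois couleurs primaires sont le rouge, le bleu et le jaune.",
}
print(format_instance(example))
```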
#### Known issues
* The machine translation process might have corrupted data containing code, cross-lingual tasks, grammatical error correction tasks, etc.
#### Citation
```
@inproceedings{chen-etal-2024-monolingual,
    title = "Monolingual or multilingual instruction tuning: Which makes a better {Alpaca}",
    author = "Pinzhen Chen and Shaoxiong Ji and Nikolay Bogoychev and Andrey Kutuzov and Barry Haddow and Kenneth Heafield",
    year = "2024",
    booktitle = "Findings of the Association for Computational Linguistics: EACL 2024",
}
``` |
mstz/glass | ---
language:
- en
tags:
- glass
- tabular_classification
- binary_classification
- UCI
pretty_name: Glass evaluation
size_categories:
- n<1k
task_categories:
- tabular-classification
configs:
- glass
- windows
- vehicles
- containers
- tableware
- headlamps
license: cc
---
# Glass
The [Glass dataset](https://archive-beta.ics.uci.edu/dataset/42/glass+identification) from the [UCI repository](https://archive-beta.ics.uci.edu).
Classify the type of glass.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|--------------------------|
| glass | Multiclass classification | Classify glass type. |
| windows | Binary classification | Is this windows glass? |
| vehicles | Binary classification | Is this vehicles glass? |
| containers | Binary classification | Is this containers glass?|
| tableware | Binary classification | Is this tableware glass? |
| headlamps | Binary classification | Is this headlamps glass? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/glass", "glass")["train"]
``` |
carlesoctav/miracl-corpus-id | ---
dataset_info:
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: target_embedding
sequence: float32
splits:
- name: train
num_bytes: 2764659648
num_examples: 1446315
download_size: 3251111063
dataset_size: 2764659648
---
# Dataset Card for "miracl-corpus-id"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zomehwh/tttttttt | ---
license: mit
---
|
kadirnar/deneme | ---
license: apache-2.0
---
|
larryvrh/WikiMedia-v20210402-Ja_Zh-filtered | ---
dataset_info:
features:
- name: ja
dtype: string
- name: zh
dtype: string
splits:
- name: train
num_bytes: 7517762
num_examples: 15989
download_size: 4720167
dataset_size: 7517762
---
# Dataset Card for "WikiMedia-v20210402-Ja_Zh-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shailja/Verilog_GitHub | ---
license: mit
---
# VeriGen
## Table of Contents
1. [Dataset Summary](#dataset-summary)
2. [Use](#use)
3. [Intended Use](#intended-use)
4. [License](#license)
5. [Citation](#citation)
## Dataset Summary
- The dataset comprises Verilog modules as entries. The entries were retrieved from the GitHub dataset on BigQuery.
- For training [models](https://huggingface.co/shailja/fine-tuned-codegen-2B-Verilog), we filtered out entries longer than 20,000 characters and removed exact duplicates (ignoring whitespace).
- **Paper:** [ Benchmarking Large Language Models for Automated Verilog RTL Code Generation](https://arxiv.org/abs/2212.11140)
- **Point of Contact:** [contact@shailja](mailto:shailja.thakur90@gmail.com)
- **Languages:** Verilog (Hardware Description Language)
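The length and deduplication filtering described above can be sketched as follows; the exact whitespace normalization used in the original pipeline is an assumption:

```python
def filter_entries(entries, max_chars=20000):
    """Keep entries at most max_chars long, dropping exact duplicates
    ignoring whitespace.

    This mirrors the filtering described above; stripping all whitespace
    as the canonical form is an assumption about the original pipeline.
    """
    seen = set()
    kept = []
    for code in entries:
        if len(code) > max_chars:
            continue  # drop over-long modules
        key = "".join(code.split())  # canonical form: all whitespace removed
        if key in seen:
            continue  # drop whitespace-insensitive duplicates
        seen.add(key)
        kept.append(code)
    return kept


modules = [
    "module add(input a, b, output c); assign c = a ^ b; endmodule",
    "module  add(input a, b, output c);\n  assign c = a ^ b;\nendmodule",  # duplicate up to whitespace
    "x" * 25000,  # exceeds the 20,000-character limit
]
filtered = filter_entries(modules)
print(len(filtered))  # 1: only the first module survives
```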
### Data Splits
The dataset only contains a train split.
### Use
```python
# pip install datasets
from datasets import load_dataset
ds = load_dataset("shailja/Verilog_GitHub", streaming=True, split="train")
print(next(iter(ds)))
# prints the first entry as a dict of fields
```
### Intended Use
The dataset consists of source code from a range of GitHub repositories. As such, it can potentially include non-compilable, low-quality, and vulnerable code.
### Attribution & Other Requirements
The dataset was not filtered for permissively licensed code only. As a result, models trained on it can generate source code verbatim from the dataset, and that code's license might require attribution and/or impose other requirements that must be respected.
# License
The dataset is licensed under the BigCode OpenRAIL-M v1 license agreement. You can find the full agreement [here](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).
# Citation
```
@misc{https://doi.org/10.48550/arxiv.2212.11140,
doi = {10.48550/ARXIV.2212.11140},
url = {https://arxiv.org/abs/2212.11140},
author = {Thakur, Shailja and Ahmad, Baleegh and Fan, Zhenxing and Pearce, Hammond and Tan, Benjamin and Karri, Ramesh and Dolan-Gavitt, Brendan and Garg, Siddharth},
title = {Benchmarking Large Language Models for Automated Verilog RTL Code Generation},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
``` |
joey234/mmlu-conceptual_physics-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 4993
num_examples: 5
- name: test
num_bytes: 438778
num_examples: 235
download_size: 13083
dataset_size: 443771
---
# Dataset Card for "mmlu-conceptual_physics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
llm-aes/hanna | ---
dataset_info:
features:
- name: Story_ID
dtype: int64
- name: Prompt
dtype: string
- name: Human
dtype: string
- name: Story
dtype: string
- name: Model
dtype: string
- name: Relevance
dtype: int64
- name: Coherence
dtype: int64
- name: Empathy
dtype: int64
- name: Surprise
dtype: int64
- name: Engagement
dtype: int64
- name: Complexity
dtype: int64
- name: Worker_ID
dtype: string
- name: Assignment_ID
dtype: string
- name: Work_time_in_seconds
dtype: float64
- name: Name
dtype: string
splits:
- name: train
num_bytes: 13401106
num_examples: 3168
download_size: 1721485
dataset_size: 13401106
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mihaien/my-full-dataset | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 533023663.805
num_examples: 6455
download_size: 561910210
dataset_size: 533023663.805
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HuggingFaceGECLM/wikipedia_urls | ---
dataset_info:
features:
- name: url
dtype: string
- name: domain
dtype: string
- name: wiki_titles
sequence: string
- name: count
dtype: int64
splits:
- name: train
num_bytes: 4215693880
num_examples: 28370288
download_size: 1535550647
dataset_size: 4215693880
---
# Dataset Card for "wikipedia_urls"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/spike_prime_robot_images | ---
license: mit
---
|
mmanikanta/real_and_ai | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': FAKE
'1': REAL
splits:
- name: train
num_bytes: 93714000.0
num_examples: 100000
- name: test
num_bytes: 18762200.0
num_examples: 20000
download_size: 50493942
dataset_size: 112476200.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
cyy0/BMTL | ---
license: agpl-3.0
---
|
Rayjun0525/test_dataset_01 | ---
license: mit
---
|
acozma/imagenet-1k-rand_hog | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
- name: params
struct:
- name: orientations
dtype: int64
- name: pixels_per_cell
dtype: int64
splits:
- name: train
num_bytes: 235174567045.0
num_examples: 500000
download_size: 89659059126
dataset_size: 235174567045.0
---
# Dataset Card for "imagenet-1k-rand_hog"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hotpot_qa | ---
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- found
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: HotpotQA
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
task_ids: []
paperswithcode_id: hotpotqa
tags:
- multi-hop
dataset_info:
- config_name: distractor
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: type
dtype: string
- name: level
dtype: string
- name: supporting_facts
sequence:
- name: title
dtype: string
- name: sent_id
dtype: int32
- name: context
sequence:
- name: title
dtype: string
- name: sentences
sequence: string
splits:
- name: train
num_bytes: 552949315
num_examples: 90447
- name: validation
num_bytes: 45716111
num_examples: 7405
download_size: 612746344
dataset_size: 598665426
- config_name: fullwiki
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: type
dtype: string
- name: level
dtype: string
- name: supporting_facts
sequence:
- name: title
dtype: string
- name: sent_id
dtype: int32
- name: context
sequence:
- name: title
dtype: string
- name: sentences
sequence: string
splits:
- name: train
num_bytes: 552949315
num_examples: 90447
- name: validation
num_bytes: 46848601
num_examples: 7405
- name: test
num_bytes: 46000102
num_examples: 7405
download_size: 660094672
dataset_size: 645798018
---
# Dataset Card for "hotpot_qa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://hotpotqa.github.io/](https://hotpotqa.github.io/)
- **Repository:** https://github.com/hotpotqa/hotpot
- **Paper:** [HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering](https://arxiv.org/abs/1809.09600)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.27 GB
- **Size of the generated dataset:** 1.24 GB
- **Total amount of disk used:** 2.52 GB
### Dataset Summary
HotpotQA is a dataset of 113k Wikipedia-based question-answer pairs with four key features: (1) the questions require finding and reasoning over multiple supporting documents to answer; (2) the questions are diverse and not constrained to any pre-existing knowledge bases or knowledge schemas; (3) sentence-level supporting facts required for reasoning are provided, allowing QA systems to reason with strong supervision and explain their predictions; (4) a new type of factoid comparison question tests QA systems' ability to extract relevant facts and perform the necessary comparison.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### distractor
- **Size of downloaded dataset files:** 612.75 MB
- **Size of the generated dataset:** 598.66 MB
- **Total amount of disk used:** 1.21 GB
An example of 'validation' looks as follows.
```
{
"answer": "This is the answer",
"context": {
"sentences": [["Sent 1"], ["Sent 21", "Sent 22"]],
"title": ["Title1", "Title 2"]
},
"id": "000001",
"level": "medium",
"question": "What is the answer?",
"supporting_facts": {
"sent_id": [0, 1, 3],
"title": ["Title of para 1", "Title of para 2", "Title of para 3"]
},
"type": "comparison"
}
```
#### fullwiki
- **Size of downloaded dataset files:** 660.10 MB
- **Size of the generated dataset:** 645.80 MB
- **Total amount of disk used:** 1.31 GB
An example of 'train' looks as follows.
```
{
"answer": "This is the answer",
"context": {
"sentences": [["Sent 1"], ["Sent 2"]],
"title": ["Title1", "Title 2"]
},
"id": "000001",
"level": "hard",
"question": "What is the answer?",
"supporting_facts": {
"sent_id": [0, 1, 3],
"title": ["Title of para 1", "Title of para 2", "Title of para 3"]
},
"type": "bridge"
}
```
### Data Fields
The data fields are the same among all splits.
#### distractor
- `id`: a `string` feature.
- `question`: a `string` feature.
- `answer`: a `string` feature.
- `type`: a `string` feature.
- `level`: a `string` feature.
- `supporting_facts`: a dictionary feature containing:
- `title`: a `string` feature.
  - `sent_id`: an `int32` feature.
- `context`: a dictionary feature containing:
- `title`: a `string` feature.
- `sentences`: a `list` of `string` features.
#### fullwiki
- `id`: a `string` feature.
- `question`: a `string` feature.
- `answer`: a `string` feature.
- `type`: a `string` feature.
- `level`: a `string` feature.
- `supporting_facts`: a dictionary feature containing:
- `title`: a `string` feature.
  - `sent_id`: an `int32` feature.
- `context`: a dictionary feature containing:
- `title`: a `string` feature.
- `sentences`: a `list` of `string` features.
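The `supporting_facts` field references sentences in `context` by paragraph title and sentence index. A minimal, self-contained sketch (using made-up values shaped like the instances above, not real dataset records) shows how to join the two to recover the supporting sentence texts:

```python
# Hypothetical example instance following the schema documented above.
example = {
    "answer": "This is the answer",
    "context": {
        "sentences": [["Sent 1"], ["Sent 21", "Sent 22"]],
        "title": ["Title1", "Title 2"],
    },
    "supporting_facts": {
        "sent_id": [0, 1],
        "title": ["Title1", "Title 2"],
    },
}

def resolve_supporting_facts(ex):
    """Map each (title, sent_id) supporting fact to its sentence text."""
    # Pair each context paragraph title with its list of sentences.
    title_to_sents = dict(zip(ex["context"]["title"], ex["context"]["sentences"]))
    return [
        title_to_sents[title][sent_id]
        for title, sent_id in zip(
            ex["supporting_facts"]["title"], ex["supporting_facts"]["sent_id"]
        )
    ]

print(resolve_supporting_facts(example))  # ['Sent 1', 'Sent 22']
```

The same lookup applies to both the `distractor` and `fullwiki` configurations, since their schemas are identical.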
### Data Splits
#### distractor
| |train|validation|
|----------|----:|---------:|
|distractor|90447| 7405|
#### fullwiki
| |train|validation|test|
|--------|----:|---------:|---:|
|fullwiki|90447| 7405|7405|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
HotpotQA is distributed under a [CC BY-SA 4.0 License](http://creativecommons.org/licenses/by-sa/4.0/).
### Citation Information
```
@inproceedings{yang2018hotpotqa,
title={{HotpotQA}: A Dataset for Diverse, Explainable Multi-hop Question Answering},
author={Yang, Zhilin and Qi, Peng and Zhang, Saizheng and Bengio, Yoshua and Cohen, William W. and Salakhutdinov, Ruslan and Manning, Christopher D.},
booktitle={Conference on Empirical Methods in Natural Language Processing ({EMNLP})},
year={2018}
}
```
### Contributions
Thanks to [@albertvillanova](https://github.com/albertvillanova), [@ghomasHudson](https://github.com/ghomasHudson) for adding this dataset. |
charsiu/librispeech_full_test_frame_labels | ---
dataset_info:
features:
- name: chapter_id
dtype: int64
- name: file
dtype: string
- name: frame_labels
sequence: string
- name: frame_labels_10ms
sequence: string
- name: id
dtype: string
- name: processed_file
dtype: string
- name: speaker_id
dtype: int64
- name: text
dtype: string
- name: upsampled_file
dtype: string
splits:
- name: train
num_bytes: 78239379
num_examples: 11125
download_size: 5460315
dataset_size: 78239379
---
# Dataset Card for "librispeech_full_test_frame_labels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_automerger__NeuralsirkrishnaExperiment26-7B | ---
pretty_name: Evaluation run of automerger/NeuralsirkrishnaExperiment26-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/NeuralsirkrishnaExperiment26-7B](https://huggingface.co/automerger/NeuralsirkrishnaExperiment26-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__NeuralsirkrishnaExperiment26-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T06:00:02.561751](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__NeuralsirkrishnaExperiment26-7B/blob/main/results_2024-03-13T06-00-02.561751.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6497749407583714,\n\
\ \"acc_stderr\": 0.03204087168819212,\n \"acc_norm\": 0.6490637412748819,\n\
\ \"acc_norm_stderr\": 0.03271053969931456,\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7725358601874959,\n\
\ \"mc2_stderr\": 0.013822624088036008\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403513,\n\
\ \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473836\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7183827922724557,\n\
\ \"acc_stderr\": 0.004488684397979498,\n \"acc_norm\": 0.8913563035251942,\n\
\ \"acc_norm_stderr\": 0.003105556631739391\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.016588680864530622,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.016588680864530622\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523369,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523369\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7725358601874959,\n\
\ \"mc2_stderr\": 0.013822624088036008\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571766\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \
\ \"acc_stderr\": 0.012661502663418698\n }\n}\n```"
repo_url: https://huggingface.co/automerger/NeuralsirkrishnaExperiment26-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|arc:challenge|25_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|gsm8k|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hellaswag|10_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T06-00-02.561751.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T06-00-02.561751.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- '**/details_harness|winogrande|5_2024-03-13T06-00-02.561751.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T06-00-02.561751.parquet'
- config_name: results
data_files:
- split: 2024_03_13T06_00_02.561751
path:
- results_2024-03-13T06-00-02.561751.parquet
- split: latest
path:
- results_2024-03-13T06-00-02.561751.parquet
---
# Dataset Card for Evaluation run of automerger/NeuralsirkrishnaExperiment26-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/NeuralsirkrishnaExperiment26-7B](https://huggingface.co/automerger/NeuralsirkrishnaExperiment26-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__NeuralsirkrishnaExperiment26-7B",
	"harness_winogrande_5",
	split="latest")
```
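Each per-task configuration follows the naming pattern visible in the metadata above, e.g. `harness_hendrycksTest_<subject>_5` for the 5-shot MMLU subjects. As an illustrative sketch (the helper name is an assumption, not part of this dataset), a config name can be built programmatically:

```python
# Illustrative helper (not part of this dataset): build the config name
# for an n-shot MMLU (hendrycksTest) subject, matching the pattern
# "harness_hendrycksTest_<subject>_<n>" used in the metadata above.
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    return f"harness_hendrycksTest_{subject}_{n_shot}"

print(mmlu_config_name("world_religions"))
# harness_hendrycksTest_world_religions_5
```

The resulting string can then be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"`.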
## Latest results
These are the [latest results from run 2024-03-13T06:00:02.561751](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__NeuralsirkrishnaExperiment26-7B/blob/main/results_2024-03-13T06-00-02.561751.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6497749407583714,
"acc_stderr": 0.03204087168819212,
"acc_norm": 0.6490637412748819,
"acc_norm_stderr": 0.03271053969931456,
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7725358601874959,
"mc2_stderr": 0.013822624088036008
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403513,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473836
},
"harness|hellaswag|10": {
"acc": 0.7183827922724557,
"acc_stderr": 0.004488684397979498,
"acc_norm": 0.8913563035251942,
"acc_norm_stderr": 0.003105556631739391
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530622,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523369,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523369
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7725358601874959,
"mc2_stderr": 0.013822624088036008
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571766
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418698
}
}
```
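The per-task numbers above can be aggregated the same way the leaderboard does for MMLU: average `acc` over all `hendrycksTest` entries. A minimal sketch, where the `results` dict is a small excerpt of the JSON above rather than the full file:

```python
# Sketch: average the "acc" metric over all hendrycksTest (MMLU) tasks.
# "results" is a small excerpt of the JSON results above, for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|winogrande|5": {"acc": 0.8484609313338595},
}

# Keep only the MMLU subtasks, then average their accuracies.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU average over {len(mmlu)} tasks: {mmlu_avg:.4f}")
```

Loading the real `results` config (or the JSON file linked above) and applying the same filter reproduces the aggregated MMLU score shown on the leaderboard.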
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
web_questions | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
paperswithcode_id: webquestions
pretty_name: WebQuestions
dataset_info:
features:
- name: url
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
splits:
- name: train
num_bytes: 530711
num_examples: 3778
- name: test
num_bytes: 288184
num_examples: 2032
download_size: 402395
dataset_size: 818895
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "web_questions"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://worksheets.codalab.org/worksheets/0xba659fe363cb46e7a505c5b6a774dc8a](https://worksheets.codalab.org/worksheets/0xba659fe363cb46e7a505c5b6a774dc8a)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [Semantic Parsing on Freebase from Question-Answer Pairs](https://aclanthology.org/D13-1160/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.27 MB
- **Size of the generated dataset:** 0.83 MB
- **Total amount of disk used:** 2.10 MB
### Dataset Summary
This dataset consists of 6,642 question/answer pairs.
The questions are meant to be answerable using Freebase, a large knowledge graph, and are mostly centered around a single named entity.
They are popular questions asked on the web (at least as of 2013).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.27 MB
- **Size of the generated dataset:** 0.83 MB
- **Total amount of disk used:** 2.10 MB
An example of 'train' looks as follows.
```
{
"answers": ["Jamaican Creole English Language", "Jamaican English"],
"question": "what does jamaican people speak?",
"url": "http://www.freebase.com/view/en/jamaica"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `url`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a `list` of `string` features.
### Data Splits
| name |train|test|
|-------|----:|---:|
|default| 3778|2032|
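Since each question carries a list of acceptable answers, predictions on this dataset are typically scored by exact match against any reference. A minimal sketch of such a scorer (with naive lowercase/strip normalization; this is not the official evaluation script), using the train example shown above:

```python
def exact_match(prediction, answers):
    """True if the prediction matches any reference answer
    after lowercasing and stripping whitespace (minimal normalization)."""
    norm = prediction.strip().lower()
    return any(norm == answer.strip().lower() for answer in answers)

# The train example shown earlier lists two acceptable answers.
answers = ["Jamaican Creole English Language", "Jamaican English"]
print(exact_match("jamaican english", answers))  # True
print(exact_match("English", answers))           # False
```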
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{berant-etal-2013-semantic,
title = "Semantic Parsing on {F}reebase from Question-Answer Pairs",
author = "Berant, Jonathan and
Chou, Andrew and
Frostig, Roy and
Liang, Percy",
booktitle = "Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing",
month = oct,
year = "2013",
address = "Seattle, Washington, USA",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D13-1160",
pages = "1533--1544",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun) for adding this dataset. |
Jinho11/jinho_data_2023-11-19 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6329815352
num_examples: 6590
- name: test
num_bytes: 791467704
num_examples: 824
- name: valid
num_bytes: 791467136
num_examples: 824
download_size: 1164200108
dataset_size: 7912750192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
Nexdata/10100_Image_caption_data_of_human_face | ---
license: cc-by-nc-nd-4.0
---
## Description
10,100 image-caption pairs of human faces, covering multiple races across four age groups: under 18, 18-45, 46-60, and over 60 years old. The collection scenes are varied, including both indoor and outdoor settings; the image content is rich, including subjects wearing masks, glasses, or headphones, facial expressions, gestures, and adversarial examples. The text descriptions are written in English and mainly describe race, gender, age, shooting angle, lighting, diversity content, etc.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1286?source=Huggingface
## Data size
10,100 images
## Race distribution
Asian, Caucasian, Black, Brown
## Gender distribution
male, female
## Age distribution
under 18 years old, 18-45 years old, 46-60 years old, over 60 years old
## Collection environment
including indoor scenes and outdoor scenes
## Collection diversity
different age groups, different collection environments, and different seasons
## Diversity of content
including wearing masks, adversarial samples, expression data, wearing glasses, wearing headphones, and multiple gestures
## Data format
image format is .jpg, text format is .txt
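Given the formats above (.jpg images with .txt captions), a plausible loading sketch is shown below. It assumes each caption file shares its image's filename stem; this pairing convention is an assumption for illustration, not documented by Nexdata, so adjust it to the actual delivery layout.

```python
from pathlib import Path

def load_pairs(root):
    """Pair every .jpg under root with the .txt caption that shares
    its filename stem (assumed layout; adjust to the actual delivery)."""
    pairs = []
    for img in sorted(Path(root).glob("*.jpg")):
        caption_file = img.with_suffix(".txt")
        if caption_file.exists():
            pairs.append((img, caption_file.read_text(encoding="utf-8").strip()))
    return pairs
```

Each caption should then be a 3-5 sentence English description, per the text-length specification below.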
## Description language
English, Chinese
## Text length
in principle, 30~60 words, usually 3-5 sentences
## Main description content
race, gender, age, shooting angle, lighting, diversity content
## Accuracy rate
the proportion of correctly labeled images is not less than 97%
# Licensing Information
Commercial License
|
haisonle001/full_sft_chat_data_filtered_final | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 9893403961.87305
num_examples: 5301513
download_size: 5178718630
dataset_size: 9893403961.87305
---
# Dataset Card for "full_sft_chat_data_filtered_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/sergei-letov | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/sergei-letov"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.035123 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/a5717aec4301e2adfb464d3b85701f74.300x300x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/sergei-letov">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Сергей Летов (Sergei Letov)</div>
<a href="https://genius.com/artists/sergei-letov">
<div style="text-align: center; font-size: 14px;">@sergei-letov</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/sergei-letov).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sergei-letov")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|7| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/sergei-letov")

# Split fractions: 90% train, 7% validation, 3% test
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# np.split cuts the list of texts at the two cumulative boundaries
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
yzhuang/autotree_snnxor_n15_l2_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 402200000
num_examples: 10000
- name: validation
num_bytes: 402200000
num_examples: 10000
- name: test
num_bytes: 402200000
num_examples: 10000
download_size: 351932552
dataset_size: 1206600000
---
# Dataset Card for "autotree_snnxor_n15_l2_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kdcyberdude/wikipedia-pa-transliteration | ---
dataset_info:
features:
- name: id
dtype: int64
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: transliterated_text
dtype: string
- name: transliterated_title
dtype: string
splits:
- name: train
num_bytes: 311038383
num_examples: 51423
download_size: 137271151
dataset_size: 311038383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikipedia-pa-transliteration"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelmallari/airbnb-usa-nc-asheville | ---
license: mit
---
|
datacrunch/finnish_alpaca | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 20402896
num_examples: 51715
download_size: 13168174
dataset_size: 20402896
---
|
gagan3012/finqa-updated | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 39975950
num_examples: 6251
- name: valid
num_bytes: 5555542
num_examples: 883
- name: test
num_bytes: 7204414
num_examples: 1147
download_size: 20874503
dataset_size: 52735906
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
alicata/data-001-ara-small | ---
license: unlicense
---
|
ITNovaML/invoices-donut-data-v1 | ---
task_categories:
- feature-extraction
language:
- en
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 235013794.0
num_examples: 426
- name: validation
num_bytes: 26678659.0
num_examples: 50
- name: test
num_bytes: 15053216.0
num_examples: 26
download_size: 197949185
dataset_size: 276745669.0
---
|
datahrvoje/twitter_dataset_1712989128 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26070
num_examples: 56
download_size: 13805
dataset_size: 26070
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_non_coordinated_subj_obj | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 6529
num_examples: 41
- name: test
num_bytes: 12152
num_examples: 74
- name: train
num_bytes: 158608
num_examples: 1310
download_size: 88003
dataset_size: 177289
---
# Dataset Card for "MULTI_VALUE_sst2_non_coordinated_subj_obj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
suzyanil/nba-data | ---
license: creativeml-openrail-m
---
|
bertbsb/Herbeebeto | ---
license: openrail
---
|
NLPC-UOM/Writing-style-classification | ---
annotations_creators: []
language_creators:
- crowdsourced
language:
- si
license:
- mit
multilinguality:
- monolingual
pretty_name: sinhala-writing-style-classification
size_categories: []
source_datasets: []
task_categories:
- text-classification
task_ids: []
---
This file contains news texts (sentences) belonging to different writing styles. The original dataset, created by {*Upeksha, D., Wijayarathna, C., Siriwardena, M.,
Lasandun, L., Wimalasuriya, C., de Silva, N., and Dias, G. (2015). Implementing a corpus for Sinhala language. 01*}, has been processed and cleaned.
If you use this dataset, please cite {*Dhananjaya et al. BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, 2022*} and the above-mentioned paper.
eduagarcia/portuguese_benchmark | ---
language:
- pt
pretty_name: Portuguese Benchmark
dataset_info:
- config_name: HateBR_offensive_binary
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': non-offensive
'1': offensive
splits:
- name: train
num_bytes: 416208
num_examples: 4480
- name: validation
num_bytes: 94237
num_examples: 1120
- name: test
num_bytes: 116658
num_examples: 1400
download_size: 411947
dataset_size: 627103
- config_name: HateBR_offensive_level
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': non-offensive
'1': slightly
'2': moderately
'3': highly
splits:
- name: train
num_bytes: 416208
num_examples: 4480
- name: validation
num_bytes: 94237
num_examples: 1120
- name: test
num_bytes: 116658
num_examples: 1400
download_size: 413064
dataset_size: 627103
- config_name: LeNER-Br
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-ORGANIZACAO
'2': I-ORGANIZACAO
'3': B-PESSOA
'4': I-PESSOA
'5': B-TEMPO
'6': I-TEMPO
'7': B-LOCAL
'8': I-LOCAL
'9': B-LEGISLACAO
'10': I-LEGISLACAO
'11': B-JURISPRUDENCIA
'12': I-JURISPRUDENCIA
splits:
- name: train
num_bytes: 3953896
num_examples: 7825
- name: validation
num_bytes: 715819
num_examples: 1177
- name: test
num_bytes: 819242
num_examples: 1390
download_size: 1049906
dataset_size: 5488957
- config_name: Portuguese_Hate_Speech_binary
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': no-hate
'1': hate
splits:
- name: train
num_bytes: 473248
num_examples: 3969
- name: validation
num_bytes: 101358
num_examples: 850
- name: test
num_bytes: 101242
num_examples: 851
download_size: 482467
dataset_size: 675848
- config_name: UlyssesNER-Br-C-coarse
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DATA
'2': I-DATA
'3': B-EVENTO
'4': I-EVENTO
'5': B-FUNDAMENTO
'6': I-FUNDAMENTO
'7': B-LOCAL
'8': I-LOCAL
'9': B-ORGANIZACAO
'10': I-ORGANIZACAO
'11': B-PESSOA
'12': I-PESSOA
'13': B-PRODUTODELEI
'14': I-PRODUTODELEI
splits:
- name: train
num_bytes: 1051410
num_examples: 679
- name: validation
num_bytes: 225883
num_examples: 146
- name: test
num_bytes: 226764
num_examples: 147
download_size: 301821
dataset_size: 1504057
- config_name: UlyssesNER-Br-C-fine
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DATA
'2': I-DATA
'3': B-EVENTO
'4': I-EVENTO
'5': B-FUNDapelido
'6': I-FUNDapelido
'7': B-FUNDlei
'8': I-FUNDlei
'9': B-FUNDprojetodelei
'10': I-FUNDprojetodelei
'11': B-LOCALconcreto
'12': I-LOCALconcreto
'13': B-LOCALvirtual
'14': I-LOCALvirtual
'15': B-ORGgovernamental
'16': I-ORGgovernamental
'17': B-ORGnaogovernamental
'18': I-ORGnaogovernamental
'19': B-ORGpartido
'20': I-ORGpartido
'21': B-PESSOAcargo
'22': I-PESSOAcargo
'23': B-PESSOAgrupocargo
'24': I-PESSOAgrupocargo
'25': B-PESSOAgrupoind
'26': I-PESSOAgrupoind
'27': B-PESSOAindividual
'28': I-PESSOAindividual
'29': B-PRODUTOoutros
'30': I-PRODUTOoutros
'31': B-PRODUTOprograma
'32': I-PRODUTOprograma
'33': B-PRODUTOsistema
'34': I-PRODUTOsistema
splits:
- name: train
num_bytes: 1051410
num_examples: 679
- name: validation
num_bytes: 225883
num_examples: 146
- name: test
num_bytes: 226764
num_examples: 147
download_size: 305985
dataset_size: 1504057
- config_name: UlyssesNER-Br-PL-coarse
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DATA
'2': I-DATA
'3': B-EVENTO
'4': I-EVENTO
'5': B-FUNDAMENTO
'6': I-FUNDAMENTO
'7': B-LOCAL
'8': I-LOCAL
'9': B-ORGANIZACAO
'10': I-ORGANIZACAO
'11': B-PESSOA
'12': I-PESSOA
'13': B-PRODUTODELEI
'14': I-PRODUTODELEI
splits:
- name: train
num_bytes: 1511905
num_examples: 2271
- name: validation
num_bytes: 305472
num_examples: 489
- name: test
num_bytes: 363207
num_examples: 524
download_size: 431964
dataset_size: 2180584
- config_name: UlyssesNER-Br-PL-fine
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DATA
'2': I-DATA
'3': B-EVENTO
'4': I-EVENTO
'5': B-FUNDapelido
'6': I-FUNDapelido
'7': B-FUNDlei
'8': I-FUNDlei
'9': B-FUNDprojetodelei
'10': I-FUNDprojetodelei
'11': B-LOCALconcreto
'12': I-LOCALconcreto
'13': B-LOCALvirtual
'14': I-LOCALvirtual
'15': B-ORGgovernamental
'16': I-ORGgovernamental
'17': B-ORGnaogovernamental
'18': I-ORGnaogovernamental
'19': B-ORGpartido
'20': I-ORGpartido
'21': B-PESSOAcargo
'22': I-PESSOAcargo
'23': B-PESSOAgrupocargo
'24': I-PESSOAgrupocargo
'25': B-PESSOAindividual
'26': I-PESSOAindividual
'27': B-PRODUTOoutros
'28': I-PRODUTOoutros
'29': B-PRODUTOprograma
'30': I-PRODUTOprograma
'31': B-PRODUTOsistema
'32': I-PRODUTOsistema
splits:
- name: train
num_bytes: 1511905
num_examples: 2271
- name: validation
num_bytes: 305472
num_examples: 489
- name: test
num_bytes: 363207
num_examples: 524
download_size: 437232
dataset_size: 2180584
- config_name: assin2-rte
features:
- name: idx
dtype: int32
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': NONE
'1': ENTAILMENT
splits:
- name: train
num_bytes: 811995
num_examples: 6500
- name: validation
num_bytes: 62824
num_examples: 500
- name: test
num_bytes: 319682
num_examples: 2448
download_size: 551190
dataset_size: 1194501
- config_name: assin2-sts
features:
- name: idx
dtype: int32
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float32
splits:
- name: train
num_bytes: 785995
num_examples: 6500
- name: validation
num_bytes: 60824
num_examples: 500
- name: test
num_bytes: 309890
num_examples: 2448
download_size: 560263
dataset_size: 1156709
- config_name: brazilian_court_decisions_judgment
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'no'
'1': partial
'2': 'yes'
splits:
- name: train
num_bytes: 2779679
num_examples: 3234
- name: validation
num_bytes: 351504
num_examples: 404
- name: test
num_bytes: 346499
num_examples: 405
download_size: 1956183
dataset_size: 3477682
- config_name: brazilian_court_decisions_unanimity
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': unanimity
'1': not-unanimity
splits:
- name: train
num_bytes: 1564695
num_examples: 1715
- name: validation
num_bytes: 197865
num_examples: 211
- name: test
num_bytes: 193928
num_examples: 204
download_size: 1069780
dataset_size: 1956488
- config_name: harem-default
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PESSOA
'2': I-PESSOA
'3': B-ORGANIZACAO
'4': I-ORGANIZACAO
'5': B-LOCAL
'6': I-LOCAL
'7': B-TEMPO
'8': I-TEMPO
'9': B-VALOR
'10': I-VALOR
'11': B-ABSTRACCAO
'12': I-ABSTRACCAO
'13': B-ACONTECIMENTO
'14': I-ACONTECIMENTO
'15': B-COISA
'16': I-COISA
'17': B-OBRA
'18': I-OBRA
'19': B-OUTRO
'20': I-OUTRO
splits:
- name: train
num_bytes: 1504542
num_examples: 121
- name: validation
num_bytes: 51182
num_examples: 8
- name: test
num_bytes: 1060778
num_examples: 128
download_size: 540547
dataset_size: 2616502
- config_name: harem-selective
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PESSOA
'2': I-PESSOA
'3': B-ORGANIZACAO
'4': I-ORGANIZACAO
'5': B-LOCAL
'6': I-LOCAL
'7': B-TEMPO
'8': I-TEMPO
'9': B-VALOR
'10': I-VALOR
splits:
- name: train
num_bytes: 1504542
num_examples: 121
- name: validation
num_bytes: 51182
num_examples: 8
- name: test
num_bytes: 1060778
num_examples: 128
download_size: 531807
dataset_size: 2616502
- config_name: mapa_pt_coarse
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-ADDRESS
'2': I-ADDRESS
'3': B-AMOUNT
'4': I-AMOUNT
'5': B-DATE
'6': I-DATE
'7': B-ORGANISATION
'8': I-ORGANISATION
'9': B-PERSON
'10': I-PERSON
'11': B-TIME
'12': I-TIME
splits:
- name: train
num_bytes: 974822
num_examples: 1086
- name: validation
num_bytes: 119702
num_examples: 105
- name: test
num_bytes: 337141
num_examples: 390
download_size: 229263
dataset_size: 1431665
- config_name: mapa_pt_fine
features:
- name: idx
dtype: int32
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-AGE
'2': I-AGE
'3': B-BUILDING
'4': I-BUILDING
'5': B-CITY
'6': I-CITY
'7': B-COUNTRY
'8': I-COUNTRY
'9': B-DAY
'10': I-DAY
'11': B-ETHNIC CATEGORY
'12': I-ETHNIC CATEGORY
'13': B-FAMILY NAME
'14': I-FAMILY NAME
'15': B-INITIAL NAME
'16': I-INITIAL NAME
'17': B-MARITAL STATUS
'18': I-MARITAL STATUS
'19': B-MONTH
'20': I-MONTH
'21': B-NATIONALITY
'22': I-NATIONALITY
'23': B-PLACE
'24': I-PLACE
'25': B-PROFESSION
'26': I-PROFESSION
'27': B-ROLE
'28': I-ROLE
'29': B-STANDARD ABBREVIATION
'30': I-STANDARD ABBREVIATION
'31': B-TERRITORY
'32': I-TERRITORY
'33': B-TITLE
'34': I-TITLE
'35': B-TYPE
'36': I-TYPE
'37': B-UNIT
'38': I-UNIT
'39': B-URL
'40': I-URL
'41': B-VALUE
'42': I-VALUE
'43': B-YEAR
'44': I-YEAR
splits:
- name: train
num_bytes: 974822
num_examples: 1086
- name: validation
num_bytes: 119702
num_examples: 105
- name: test
num_bytes: 337141
num_examples: 390
download_size: 231886
dataset_size: 1431665
- config_name: multi_eurlex_pt
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: labels
sequence:
class_label:
names:
'0': '100149'
'1': '100160'
'2': '100148'
'3': '100147'
'4': '100152'
'5': '100143'
'6': '100156'
'7': '100158'
'8': '100154'
'9': '100153'
'10': '100142'
'11': '100145'
'12': '100150'
'13': '100162'
'14': '100159'
'15': '100144'
'16': '100151'
'17': '100157'
'18': '100161'
'19': '100146'
'20': '100155'
splits:
- name: train
num_bytes: 0
num_examples: 0
- name: validation
num_bytes: 0
num_examples: 0
- name: test
num_bytes: 0
num_examples: 0
download_size: 4770
dataset_size: 0
configs:
- config_name: HateBR_offensive_binary
data_files:
- split: train
path: HateBR_offensive_binary/train-*
- split: validation
path: HateBR_offensive_binary/validation-*
- split: test
path: HateBR_offensive_binary/test-*
- config_name: HateBR_offensive_level
data_files:
- split: train
path: HateBR_offensive_level/train-*
- split: validation
path: HateBR_offensive_level/validation-*
- split: test
path: HateBR_offensive_level/test-*
- config_name: LeNER-Br
data_files:
- split: train
path: LeNER-Br/train-*
- split: validation
path: LeNER-Br/validation-*
- split: test
path: LeNER-Br/test-*
- config_name: Portuguese_Hate_Speech_binary
data_files:
- split: train
path: Portuguese_Hate_Speech_binary/train-*
- split: validation
path: Portuguese_Hate_Speech_binary/validation-*
- split: test
path: Portuguese_Hate_Speech_binary/test-*
- config_name: UlyssesNER-Br-C-coarse
data_files:
- split: train
path: UlyssesNER-Br-C-coarse/train-*
- split: validation
path: UlyssesNER-Br-C-coarse/validation-*
- split: test
path: UlyssesNER-Br-C-coarse/test-*
- config_name: UlyssesNER-Br-C-fine
data_files:
- split: train
path: UlyssesNER-Br-C-fine/train-*
- split: validation
path: UlyssesNER-Br-C-fine/validation-*
- split: test
path: UlyssesNER-Br-C-fine/test-*
- config_name: UlyssesNER-Br-PL-coarse
data_files:
- split: train
path: UlyssesNER-Br-PL-coarse/train-*
- split: validation
path: UlyssesNER-Br-PL-coarse/validation-*
- split: test
path: UlyssesNER-Br-PL-coarse/test-*
- config_name: UlyssesNER-Br-PL-fine
data_files:
- split: train
path: UlyssesNER-Br-PL-fine/train-*
- split: validation
path: UlyssesNER-Br-PL-fine/validation-*
- split: test
path: UlyssesNER-Br-PL-fine/test-*
- config_name: assin2-rte
data_files:
- split: train
path: assin2-rte/train-*
- split: validation
path: assin2-rte/validation-*
- split: test
path: assin2-rte/test-*
- config_name: assin2-sts
data_files:
- split: train
path: assin2-sts/train-*
- split: validation
path: assin2-sts/validation-*
- split: test
path: assin2-sts/test-*
- config_name: brazilian_court_decisions_judgment
data_files:
- split: train
path: brazilian_court_decisions_judgment/train-*
- split: validation
path: brazilian_court_decisions_judgment/validation-*
- split: test
path: brazilian_court_decisions_judgment/test-*
- config_name: brazilian_court_decisions_unanimity
data_files:
- split: train
path: brazilian_court_decisions_unanimity/train-*
- split: validation
path: brazilian_court_decisions_unanimity/validation-*
- split: test
path: brazilian_court_decisions_unanimity/test-*
- config_name: harem-default
data_files:
- split: train
path: harem-default/train-*
- split: validation
path: harem-default/validation-*
- split: test
path: harem-default/test-*
- config_name: harem-selective
data_files:
- split: train
path: harem-selective/train-*
- split: validation
path: harem-selective/validation-*
- split: test
path: harem-selective/test-*
- config_name: mapa_pt_coarse
data_files:
- split: train
path: mapa_pt_coarse/train-*
- split: validation
path: mapa_pt_coarse/validation-*
- split: test
path: mapa_pt_coarse/test-*
- config_name: mapa_pt_fine
data_files:
- split: train
path: mapa_pt_fine/train-*
- split: validation
path: mapa_pt_fine/validation-*
- split: test
path: mapa_pt_fine/test-*
- config_name: multi_eurlex_pt
data_files:
- split: train
path: multi_eurlex_pt/train-*
- split: validation
path: multi_eurlex_pt/validation-*
- split: test
path: multi_eurlex_pt/test-*
---
# Portuguese Benchmark |
autoevaluate/autoeval-staging-eval-project-be45ecbd-7284773 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: echarlaix/bart-base-cnn-r2-19.4-d35-hybrid
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: echarlaix/bart-base-cnn-r2-19.4-d35-hybrid
* Dataset: cnn_dailymail
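A quick, self-contained way to sanity-check predictions against the reference `highlights` column is a simple unigram-recall score (a rough ROUGE-1 analogue; this is an illustrative sketch, not the exact metric AutoTrain reports):

```python
from collections import Counter

def unigram_recall(prediction: str, reference: str) -> float:
    """Fraction of reference tokens also present in the prediction."""
    pred_counts = Counter(prediction.lower().split())
    ref_counts = Counter(reference.lower().split())
    overlap = sum(min(pred_counts[tok], n) for tok, n in ref_counts.items())
    return overlap / max(sum(ref_counts.values()), 1)

# Toy example; real use would iterate over the prediction/highlights pairs.
score = unigram_recall("the cat sat on the mat", "the cat sat on a mat")
print(round(score, 2))  # 0.83
```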
To run new evaluation jobs, visit Hugging Face's [automatic evaluation service](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
tyzhu/fwv2_random_rare_tip_train_1000_eval_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 360515
num_examples: 2100
- name: train_doc2id
num_bytes: 100243
num_examples: 1100
- name: train_id2doc
num_bytes: 103543
num_examples: 1100
- name: train_find_word
num_bytes: 256972
num_examples: 1000
- name: eval_find_word
num_bytes: 18440
num_examples: 100
- name: id_context_mapping
num_bytes: 68343
num_examples: 1100
download_size: 0
dataset_size: 908056
---
# Dataset Card for "fwv2_random_rare_tip_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NLP-proj/dataset1 | ---
license: unknown
---
|
Raihan004/All_10_Action | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': কথা_বলা
'1': কম্পিউটার_ব্যবহার_করা
'2': খাওয়া
'3': খেলা_করা
'4': ঘুমানো
'5': পড়া
'6': পান_করা
'7': রান্না_করা
'8': লেখা
'9': হাঁটা
splits:
- name: train
num_bytes: 450039362.261335
num_examples: 3972
- name: test
num_bytes: 64023200.75866496
num_examples: 702
download_size: 494658461
dataset_size: 514062563.02
---
# Dataset Card for "All_10_Action"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
piercemaloney/coqgym_coq_projects_v2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: finmap
num_bytes: 745110
num_examples: 3
- name: GeometricAlgebra
num_bytes: 2180457
num_examples: 8
- name: bdds
num_bytes: 11537326
num_examples: 15
- name: concat
num_bytes: 1876052
num_examples: 90
- name: topology
num_bytes: 1998914
num_examples: 32
- name: euler_formula
num_bytes: 2157257
num_examples: 3
- name: ruler_compass_geometry
num_bytes: 531800
num_examples: 39
- name: fcsl_pcm
num_bytes: 1911756
num_examples: 12
- name: twoSquare
num_bytes: 219009
num_examples: 2
- name: zfc
num_bytes: 605621
num_examples: 11
- name: shuffle
num_bytes: 87431
num_examples: 6
- name: metalib
num_bytes: 190046
num_examples: 11
- name: hardware
num_bytes: 185840
num_examples: 25
- name: three_gap
num_bytes: 315458
num_examples: 7
- name: coq_ext_lib
num_bytes: 311782
num_examples: 47
- name: cheerios
num_bytes: 140210
num_examples: 6
- name: regexp
num_bytes: 116907
num_examples: 7
- name: coq_library_undecidability
num_bytes: 3089099
num_examples: 93
- name: automata
num_bytes: 1075636
num_examples: 25
- name: coquelicot
num_bytes: 6616788
num_examples: 23
- name: izf
num_bytes: 146423
num_examples: 8
- name: lemma_overloading
num_bytes: 1075228
num_examples: 27
- name: lin_alg
num_bytes: 6760336
num_examples: 68
- name: railroad_crossing
num_bytes: 202543
num_examples: 1
- name: idxassoc
num_bytes: 139339
num_examples: 2
- name: hoare_tut
num_bytes: 39069
num_examples: 3
- name: lesniewski_mereology
num_bytes: 123636
num_examples: 2
- name: verdi
num_bytes: 12479953
num_examples: 28
- name: additions
num_bytes: 147048
num_examples: 20
- name: checker
num_bytes: 6689
num_examples: 2
- name: VST
num_bytes: 38787573
num_examples: 292
- name: domain_theory
num_bytes: 67545
num_examples: 4
- name: propcalc
num_bytes: 54422
num_examples: 5
- name: circuits
num_bytes: 399265
num_examples: 19
- name: CompCert
num_bytes: 14680669
num_examples: 142
- name: area_method
num_bytes: 1578799
num_examples: 38
- name: bbv
num_bytes: 705745
num_examples: 13
- name: ails
num_bytes: 521926
num_examples: 11
- name: dep_map
num_bytes: 24951
num_examples: 2
- name: ChargeCore
num_bytes: 273913
num_examples: 20
- name: markov
num_bytes: 102887
num_examples: 1
- name: rsa
num_bytes: 180155
num_examples: 5
- name: verdi_raft
num_bytes: 21320585
num_examples: 107
- name: goedel
num_bytes: 9110441
num_examples: 44
- name: bigenough
num_bytes: 5031
num_examples: 1
- name: generic_environments
num_bytes: 142529
num_examples: 2
- name: disel
num_bytes: 4787193
num_examples: 37
- name: ctltctl
num_bytes: 61038
num_examples: 3
- name: coq_list_string
num_bytes: 6409
num_examples: 3
- name: QuickChick
num_bytes: 718709
num_examples: 17
- name: schroeder
num_bytes: 24958
num_examples: 4
- name: lazy_pcf
num_bytes: 497224
num_examples: 13
- name: weak_up_to
num_bytes: 175365
num_examples: 10
- name: groups
num_bytes: 11115
num_examples: 1
- name: pocklington
num_bytes: 617483
num_examples: 13
- name: mini_compiler
num_bytes: 14459
num_examples: 1
- name: StructTact
num_bytes: 293887
num_examples: 17
- name: exceptions
num_bytes: 8962
num_examples: 1
- name: coqrel
num_bytes: 196290
num_examples: 12
- name: higman_s
num_bytes: 198793
num_examples: 5
- name: bellantonicook
num_bytes: 3852234
num_examples: 16
- name: rem
num_bytes: 14504
num_examples: 1
- name: tree_automata
num_bytes: 1972575
num_examples: 17
- name: coq_procrastination
num_bytes: 26763
num_examples: 1
- name: higman_cf
num_bytes: 49720
num_examples: 2
- name: GeoCoq
num_bytes: 12090435
num_examples: 328
- name: coqoban
num_bytes: 38285
num_examples: 1
- name: search_trees
num_bytes: 59332
num_examples: 5
- name: system
num_bytes: 2053
num_examples: 1
- name: ieee754
num_bytes: 35416
num_examples: 3
- name: jordan_curve_theorem
num_bytes: 12530912
num_examples: 10
- name: huffman
num_bytes: 1634130
num_examples: 25
- name: zf
num_bytes: 933118
num_examples: 10
- name: hedges
num_bytes: 360884
num_examples: 1
- name: zorns_lemma
num_bytes: 608752
num_examples: 19
- name: tortoise_hare_algorithm
num_bytes: 10084
num_examples: 1
- name: mod_red
num_bytes: 636981
num_examples: 5
- name: UnifySL
num_bytes: 1716822
num_examples: 128
- name: traversable_fincontainer
num_bytes: 429019
num_examples: 1
- name: buchberger
num_bytes: 2422607
num_examples: 29
- name: constructive_geometry
num_bytes: 80179
num_examples: 7
- name: tarski_geometry
num_bytes: 112419
num_examples: 8
- name: int_map
num_bytes: 817835
num_examples: 13
- name: float
num_bytes: 2994074
num_examples: 31
- name: InfSeqExt
num_bytes: 106656
num_examples: 5
- name: zchinese
num_bytes: 62626
num_examples: 6
- name: smc
num_bytes: 6045540
num_examples: 15
- name: pts
num_bytes: 72482
num_examples: 8
- name: param_pi
num_bytes: 2596347
num_examples: 11
- name: axiomatic_abp
num_bytes: 1204713
num_examples: 7
- name: lambda
num_bytes: 181085
num_examples: 10
- name: maths
num_bytes: 37685
num_examples: 3
- name: quicksort_complexity
num_bytes: 489164
num_examples: 28
- name: fssec_model
num_bytes: 1569135
num_examples: 25
- name: ipc
num_bytes: 3108901
num_examples: 31
- name: chinese
num_bytes: 208365
num_examples: 13
- name: cours_de_coq
num_bytes: 71295
num_examples: 11
- name: graphs
num_bytes: 644609
num_examples: 2
- name: dictionaries
num_bytes: 67746
num_examples: 1
- name: dblib
num_bytes: 195344
num_examples: 6
- name: cecoa
num_bytes: 2449356
num_examples: 14
- name: corespec
num_bytes: 1222339
num_examples: 28
- name: free_groups
num_bytes: 63973
num_examples: 1
- name: ramsey
num_bytes: 11734
num_examples: 1
- name: qarith
num_bytes: 51068
num_examples: 2
- name: math_comp
num_bytes: 44708646
num_examples: 76
- name: amm11262
num_bytes: 365079
num_examples: 5
- name: angles
num_bytes: 322579
num_examples: 5
- name: orb_stab
num_bytes: 204783
num_examples: 1
- name: qarith_stern_brocot
num_bytes: 10873352
num_examples: 35
- name: Categories
num_bytes: 4945
num_examples: 1
- name: group_theory
num_bytes: 104940
num_examples: 10
- name: demos
num_bytes: 62208
num_examples: 5
- name: distributed_reference_counting
num_bytes: 8888400
num_examples: 74
- name: subst
num_bytes: 362195
num_examples: 17
- name: miniml
num_bytes: 114099
num_examples: 1
- name: algebra
num_bytes: 3275753
num_examples: 65
- name: fermat4
num_bytes: 172156
num_examples: 5
- name: otway_rees
num_bytes: 226052
num_examples: 19
- name: SCEV_coq
num_bytes: 89902
num_examples: 1
- name: PolTac
num_bytes: 157370
num_examples: 13
- name: fundamental_arithmetics
num_bytes: 308733
num_examples: 8
download_size: 37361889
dataset_size: 286711572
configs:
- config_name: default
data_files:
- split: finmap
path: data/finmap-*
- split: GeometricAlgebra
path: data/GeometricAlgebra-*
- split: bdds
path: data/bdds-*
- split: concat
path: data/concat-*
- split: topology
path: data/topology-*
- split: euler_formula
path: data/euler_formula-*
- split: ruler_compass_geometry
path: data/ruler_compass_geometry-*
- split: fcsl_pcm
path: data/fcsl_pcm-*
- split: twoSquare
path: data/twoSquare-*
- split: zfc
path: data/zfc-*
- split: shuffle
path: data/shuffle-*
- split: metalib
path: data/metalib-*
- split: hardware
path: data/hardware-*
- split: three_gap
path: data/three_gap-*
- split: coq_ext_lib
path: data/coq_ext_lib-*
- split: cheerios
path: data/cheerios-*
- split: regexp
path: data/regexp-*
- split: coq_library_undecidability
path: data/coq_library_undecidability-*
- split: automata
path: data/automata-*
- split: coquelicot
path: data/coquelicot-*
- split: izf
path: data/izf-*
- split: lemma_overloading
path: data/lemma_overloading-*
- split: lin_alg
path: data/lin_alg-*
- split: railroad_crossing
path: data/railroad_crossing-*
- split: idxassoc
path: data/idxassoc-*
- split: hoare_tut
path: data/hoare_tut-*
- split: lesniewski_mereology
path: data/lesniewski_mereology-*
- split: verdi
path: data/verdi-*
- split: additions
path: data/additions-*
- split: checker
path: data/checker-*
- split: VST
path: data/VST-*
- split: domain_theory
path: data/domain_theory-*
- split: propcalc
path: data/propcalc-*
- split: circuits
path: data/circuits-*
- split: CompCert
path: data/CompCert-*
- split: area_method
path: data/area_method-*
- split: bbv
path: data/bbv-*
- split: ails
path: data/ails-*
- split: dep_map
path: data/dep_map-*
- split: ChargeCore
path: data/ChargeCore-*
- split: markov
path: data/markov-*
- split: rsa
path: data/rsa-*
- split: verdi_raft
path: data/verdi_raft-*
- split: goedel
path: data/goedel-*
- split: bigenough
path: data/bigenough-*
- split: generic_environments
path: data/generic_environments-*
- split: disel
path: data/disel-*
- split: ctltctl
path: data/ctltctl-*
- split: coq_list_string
path: data/coq_list_string-*
- split: QuickChick
path: data/QuickChick-*
- split: schroeder
path: data/schroeder-*
- split: lazy_pcf
path: data/lazy_pcf-*
- split: weak_up_to
path: data/weak_up_to-*
- split: groups
path: data/groups-*
- split: pocklington
path: data/pocklington-*
- split: mini_compiler
path: data/mini_compiler-*
- split: StructTact
path: data/StructTact-*
- split: exceptions
path: data/exceptions-*
- split: coqrel
path: data/coqrel-*
- split: higman_s
path: data/higman_s-*
- split: bellantonicook
path: data/bellantonicook-*
- split: rem
path: data/rem-*
- split: tree_automata
path: data/tree_automata-*
- split: coq_procrastination
path: data/coq_procrastination-*
- split: higman_cf
path: data/higman_cf-*
- split: GeoCoq
path: data/GeoCoq-*
- split: coqoban
path: data/coqoban-*
- split: search_trees
path: data/search_trees-*
- split: system
path: data/system-*
- split: ieee754
path: data/ieee754-*
- split: jordan_curve_theorem
path: data/jordan_curve_theorem-*
- split: huffman
path: data/huffman-*
- split: zf
path: data/zf-*
- split: hedges
path: data/hedges-*
- split: zorns_lemma
path: data/zorns_lemma-*
- split: tortoise_hare_algorithm
path: data/tortoise_hare_algorithm-*
- split: mod_red
path: data/mod_red-*
- split: UnifySL
path: data/UnifySL-*
- split: traversable_fincontainer
path: data/traversable_fincontainer-*
- split: buchberger
path: data/buchberger-*
- split: constructive_geometry
path: data/constructive_geometry-*
- split: tarski_geometry
path: data/tarski_geometry-*
- split: int_map
path: data/int_map-*
- split: float
path: data/float-*
- split: InfSeqExt
path: data/InfSeqExt-*
- split: zchinese
path: data/zchinese-*
- split: smc
path: data/smc-*
- split: pts
path: data/pts-*
- split: param_pi
path: data/param_pi-*
- split: axiomatic_abp
path: data/axiomatic_abp-*
- split: lambda
path: data/lambda-*
- split: maths
path: data/maths-*
- split: quicksort_complexity
path: data/quicksort_complexity-*
- split: fssec_model
path: data/fssec_model-*
- split: ipc
path: data/ipc-*
- split: chinese
path: data/chinese-*
- split: cours_de_coq
path: data/cours_de_coq-*
- split: graphs
path: data/graphs-*
- split: dictionaries
path: data/dictionaries-*
- split: dblib
path: data/dblib-*
- split: cecoa
path: data/cecoa-*
- split: corespec
path: data/corespec-*
- split: free_groups
path: data/free_groups-*
- split: ramsey
path: data/ramsey-*
- split: qarith
path: data/qarith-*
- split: math_comp
path: data/math_comp-*
- split: amm11262
path: data/amm11262-*
- split: angles
path: data/angles-*
- split: orb_stab
path: data/orb_stab-*
- split: qarith_stern_brocot
path: data/qarith_stern_brocot-*
- split: Categories
path: data/Categories-*
- split: group_theory
path: data/group_theory-*
- split: demos
path: data/demos-*
- split: distributed_reference_counting
path: data/distributed_reference_counting-*
- split: subst
path: data/subst-*
- split: miniml
path: data/miniml-*
- split: algebra
path: data/algebra-*
- split: fermat4
path: data/fermat4-*
- split: otway_rees
path: data/otway_rees-*
- split: SCEV_coq
path: data/SCEV_coq-*
- split: PolTac
path: data/PolTac-*
- split: fundamental_arithmetics
path: data/fundamental_arithmetics-*
---
|
open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.41 | ---
pretty_name: Evaluation run of liminerity/Blur-7b-slerp-v1.41
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Blur-7b-slerp-v1.41](https://huggingface.co/liminerity/Blur-7b-slerp-v1.41)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.41\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T15:46:41.028192](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.41/blob/main/results_2024-02-29T15-46-41.028192.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543777066128611,\n\
\ \"acc_stderr\": 0.03190391340927578,\n \"acc_norm\": 0.653816439134923,\n\
\ \"acc_norm_stderr\": 0.032568943473454945,\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.742326366705102,\n\
\ \"mc2_stderr\": 0.014265565473632048\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7051384186417048,\n\
\ \"acc_stderr\": 0.004550486186019073,\n \"acc_norm\": 0.8864767974507071,\n\
\ \"acc_norm_stderr\": 0.0031658294884891833\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.039439666991836285,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.039439666991836285\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n\
\ \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n\
\ \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n\
\ \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\"\
: {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n\
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.742326366705102,\n\
\ \"mc2_stderr\": 0.014265565473632048\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \
\ \"acc_stderr\": 0.012435042334904006\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7b-slerp-v1.41
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|arc:challenge|25_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|gsm8k|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hellaswag|10_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-46-41.028192.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T15-46-41.028192.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- '**/details_harness|winogrande|5_2024-02-29T15-46-41.028192.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T15-46-41.028192.parquet'
- config_name: results
data_files:
- split: 2024_02_29T15_46_41.028192
path:
- results_2024-02-29T15-46-41.028192.parquet
- split: latest
path:
- results_2024-02-29T15-46-41.028192.parquet
---
# Dataset Card for Evaluation run of liminerity/Blur-7b-slerp-v1.41
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-slerp-v1.41](https://huggingface.co/liminerity/Blur-7b-slerp-v1.41) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.41",
"harness_winogrande_5",
split="train")
```
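Each per-task config name appears to be derived from the harness task name by replacing the `|`, `:`, and `-` separators with underscores (this is an assumption based on the config list in this card's metadata, e.g. `harness|hendrycksTest-virology|5` maps to `harness_hendrycksTest_virology_5`). A small helper sketch for building the config name to pass to `load_dataset`:

```python
def task_to_config_name(task_name: str) -> str:
    """Map a harness task name such as 'harness|hendrycksTest-virology|5'
    to the dataset config name 'harness_hendrycksTest_virology_5'.

    Assumption: the mapping simply replaces '|', ':' and '-' with '_',
    which matches every config declared in this card's metadata.
    """
    return task_name.replace("|", "_").replace(":", "_").replace("-", "_")


# Examples matching the configs declared above:
print(task_to_config_name("harness|hendrycksTest-world_religions|5"))
# harness_hendrycksTest_world_religions_5
print(task_to_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```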
## Latest results
These are the [latest results from run 2024-02-29T15:46:41.028192](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.41/blob/main/results_2024-02-29T15-46-41.028192.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543777066128611,
"acc_stderr": 0.03190391340927578,
"acc_norm": 0.653816439134923,
"acc_norm_stderr": 0.032568943473454945,
"mc1": 0.5924112607099143,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.742326366705102,
"mc2_stderr": 0.014265565473632048
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7051384186417048,
"acc_stderr": 0.004550486186019073,
"acc_norm": 0.8864767974507071,
"acc_norm_stderr": 0.0031658294884891833
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.039439666991836285,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.039439666991836285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5924112607099143,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.742326366705102,
"mc2_stderr": 0.014265565473632048
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.012435042334904006
}
}
```
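The `acc_stderr` fields above are standard errors, so an approximate 95% confidence interval for any score can be formed as `acc ± 1.96 × acc_stderr`. A minimal sketch using the winogrande numbers from this run:

```python
def confidence_interval(acc: float, stderr: float, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% confidence interval from an accuracy and its standard error."""
    return acc - z * stderr, acc + z * stderr


# Winogrande (5-shot) values from the results above
lo, hi = confidence_interval(0.8389897395422258, 0.010329712832785722)
print(f"winogrande acc 95% CI: [{lo:.3f}, {hi:.3f}]")  # roughly [0.819, 0.859]
```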
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
azain/test2 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcript
dtype: string
- name: speaker_id
dtype: string
splits:
- name: train
num_bytes: 4231636.0
num_examples: 10
download_size: 4224152
dataset_size: 4231636.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|