| datasetId | card |
|---|---|
vishruthnath/Calc-svamp-Tagged | ---
language:
- en
dataset_info:
features:
- name: chain
dtype: string
- name: equation
dtype: string
- name: id
dtype: string
- name: num_unique_ops
dtype: int64
- name: operand
sequence: float64
- name: operand_tags
sequence: int64
- name: operation
dtype: string
- name: problem_type
dtype: string
- name: question
dtype: string
- name: question_split
sequence: string
- name: result
dtype: string
- name: result_float
dtype: float64
- name: valid
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: test
num_bytes: 723143
num_examples: 811
download_size: 156032
dataset_size: 723143
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
rjds0207/Beto | ---
license: openrail
---
|
georgiyozhegov/habr | ---
license: mit
task_categories:
- text-generation
language:
- ru
- en
pretty_name: habr
--- |
conjunct/rps_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': paper
'1': rock
'2': scissors
'3': none
- name: src
dtype:
class_label:
names:
'0': original
'1': kaggle
'2': blank
- name: annotated
dtype: image
- name: landmarks_old
dtype: string
- name: pca
dtype: 'null'
- name: landmarks
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1391904290.5
num_examples: 9170
- name: test
num_bytes: 311273298.25
num_examples: 2035
download_size: 1699467853
dataset_size: 1703177588.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Rami/prompts | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: views
dtype: int64
- name: instruction
dtype: string
- name: category
dtype: string
- name: view
dtype: int64
- name: completed_text
dtype: string
splits:
- name: train
num_bytes: 530466
num_examples: 390
download_size: 198596
dataset_size: 530466
---
# Dataset Card for "prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mkja/hnyta | ---
license: artistic-2.0
---
|
oroikon/chart_captioning | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 395695728.546
num_examples: 7057
- name: test
num_bytes: 48381523.0
num_examples: 882
- name: validation
num_bytes: 48266912.0
num_examples: 883
download_size: 480469420
dataset_size: 492344163.546
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "chart_captioning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Suqqaro/mokcomedy-voice | ---
license: unknown
---
|
hacktoberfest-corpus-es/newyorker_caption_contest_spanish | ---
license: cc-by-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: contest_number
dtype: int32
- name: image_location
dtype: string
- name: image_description
dtype: string
- name: image_uncanny_description
dtype: string
- name: entities
sequence: string
- name: questions
sequence: string
- name: caption_choices
dtype: string
- name: from_description
dtype: string
- name: label
dtype: string
- name: n_tokens_label
dtype: int32
- name: instance_id
dtype: string
splits:
- name: train
num_bytes: 134115134.64
num_examples: 2340
- name: validation
num_bytes: 8055329.0
num_examples: 130
- name: test
num_bytes: 6878764.0
num_examples: 131
download_size: 139896532
dataset_size: 149049227.64
---
|
CyberHarem/illyasviel_von_einzbern_fatekaleidlinerprismaillya | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Illyasviel von Einzbern
This is the dataset of Illyasviel von Einzbern, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 686 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 686 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 686 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 686 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
DavidMOBrien/sample-sklearn | ---
dataset_info:
features:
- name: before
dtype: string
- name: after
dtype: string
splits:
- name: train
num_bytes: 5538359.416666667
num_examples: 998
- name: test
num_bytes: 693682.2916666666
num_examples: 125
- name: valid
num_bytes: 693682.2916666666
num_examples: 125
download_size: 2782945
dataset_size: 6925724.000000001
---
# Dataset Card for "sample-sklearn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nayohan/fmt-bench | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: eval_indicator
dtype: string
splits:
- name: test
num_bytes: 47700
num_examples: 80
download_size: 0
dataset_size: 47700
---
# Dataset Card for "fmt-bench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/based_1 | ---
dataset_info:
features:
- name: human
dtype: string
- name: bot
dtype: string
splits:
- name: train
num_bytes: 50290
num_examples: 176
download_size: 36285
dataset_size: 50290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "based_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thimira/sinhala-llama-2-data-format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 130696734.22033107
num_examples: 30879
- name: test
num_bytes: 32683706.77966892
num_examples: 7722
download_size: 65325000
dataset_size: 163380441.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hs-knowledge/hatecheck-enriched | ---
dataset_info:
features:
- name: functionality
dtype: string
- name: case_id
dtype: int64
- name: test_case
dtype: string
- name: label_gold
dtype: string
- name: target_ident
dtype: string
- name: direction
dtype: string
- name: focus_words
dtype: string
- name: focus_lemma
dtype: string
- name: ref_case_id
dtype: float64
- name: ref_templ_id
dtype: float64
- name: templ_id
dtype: int64
- name: case_templ
dtype: string
- name: ner_output
struct:
- name: entities
list:
- name: end
dtype: int64
- name: kg_results
struct:
- name: '@context'
struct:
- name: '@vocab'
dtype: string
- name: EntitySearchResult
dtype: string
- name: detailedDescription
dtype: string
- name: goog
dtype: string
- name: kg
dtype: string
- name: resultScore
dtype: string
- name: '@type'
dtype: string
- name: itemListElement
list:
- name: '@type'
dtype: string
- name: result
struct:
- name: '@id'
dtype: string
- name: '@type'
sequence: string
- name: description
dtype: string
- name: detailedDescription
struct:
- name: articleBody
dtype: string
- name: license
dtype: string
- name: url
dtype: string
- name: image
struct:
- name: contentUrl
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: url
dtype: string
- name: resultScore
dtype: float64
- name: wikidata_id
dtype: string
- name: query_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: type
dtype: string
- name: labels
sequence: string
- name: sentence
dtype: string
- name: tokens
sequence: string
- name: entities
list:
- name: '@type'
dtype: string
- name: end
dtype: int64
- name: kg_result
struct:
- name: '@id'
dtype: string
- name: '@type'
sequence: string
- name: description
dtype: string
- name: detailedDescription
struct:
- name: articleBody
dtype: string
- name: license
dtype: string
- name: url
dtype: string
- name: image
struct:
- name: contentUrl
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: url
dtype: string
- name: resultScore
dtype: float64
- name: score
dtype: float64
- name: similarity
dtype: float64
- name: start
dtype: int64
- name: text
dtype: string
- name: type
dtype: string
- name: wikidata_id
dtype: string
splits:
- name: test
num_bytes: 1647429
num_examples: 3728
download_size: 392671
dataset_size: 1647429
---
# Dataset Card for "hatecheck-enriched"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adalib/monkey-cond-gen-sub-30 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: api
dtype: string
splits:
- name: train
num_bytes: 14566768
num_examples: 1021
download_size: 5006187
dataset_size: 14566768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chathuru/cicids2018 | ---
task_categories:
- text-classification
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 31000
num_examples: 96
- name: test
num_bytes: 7655
num_examples: 24
download_size: 22022
dataset_size: 38655
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
tags:
- Cybersecurity
- Network Intrusion Detection
- CIC-IDS2018
---
|
EliasKD/test-dataset-repo | ---
dataset_info:
features:
- name: text
dtype: string
- name: Is related to Tech
dtype: string
- name: Sentiment
dtype: string
- name: Not related to 5BIG
dtype: string
splits:
- name: train
num_bytes: 776
num_examples: 6
download_size: 3395
dataset_size: 776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hynky/TSP-QA | ---
license: apache-2.0
---
# TSP-QA
The TSP-QA dataset is a collection of questions from the TSP (Testy studijních předpokladů) exams, which are used for admission to Masaryk University in the Czech Republic. The dataset contains questions from the following categories:
- Culture
- Verbal
- Critical
- Analytical
# Accessing the Dataset
The TSP-QA dataset is not publicly available. The author of the dataset reached out to [Masaryk University](https://www.muni.cz/) for permission to publish the dataset, but the request was denied.
# Why the Code for Scraping the Dataset is Not Available
The author of the dataset wanted to make the code for scraping the dataset available, but Masaryk University removed most of the data from their website after the author's request. As a result, there is currently no way to scrape the data.
However, scraping may become possible again in the future. If you are interested in the current state of availability, contact me at kydlicek.hynek@gmail.com.
# What's Inside Each Category of TSP-QA?
The following are examples of questions from each category of the TSP-QA dataset:
### Culture
**Jedinou ... seznamu světového dědictví UNESCO je ?**
**Britský badatel nebyl vystudovaným přírodovědcem, ale takřka pět let trvající plavba kolem ... O co se jednalo?**
### Verbal
**V jednom z afrických jazyků lze následující české věty vyjádřit takto: ... Určete v následujících příkladech ten, který v tomto africkém jazyce odpovídá českému přítomnému času.**
**Jsou-li pravdivé tyto dvě věty: ... vyberte, které z následujících souvětí je také nutně pravdivé:**
### Critical
**V populaci země X je 10 % diabetiků ... Jaká je pravděpodobnost, že tento člověk má současně i cévní onemocnění?**
**V lednici je pomocí termostatu udržována stálá teplota 0 ℃. Do lednice ... Vyberte správný závěr resp. argumentaci.**
### Analytical
**Třetina trasy z A do B měří ... Kolik měří polovina trasy z A do B na mapě s měřítkem 1 : 750 000?**
**Tvrzení: „Jestliže mi ... Vyberte opačné tvrzení.**
The following table shows the number of questions in each category of the TSP-QA dataset:

|
fractalego/wafl-functions-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: memory
dtype: string
- name: rules
dtype: string
- name: positive_conversation
dtype: string
- name: negative_conversation
dtype: string
splits:
- name: train
num_bytes: 4428113
num_examples: 981
download_size: 2064976
dataset_size: 4428113
---
# Dataset Card for "wafl-functions-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gamegyu/guanaco-llama2-100 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 142459
num_examples: 100
download_size: 91410
dataset_size: 142459
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VirtualRoyalty/20ng_not_enough_data | ---
task_categories:
- text-classification
language:
- en
pretty_name: 20ng_not_enough_data
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
weaviate/WithoutRetrieval-SchemaSplit-Test-40 | ---
license: apache-2.0
---
|
recmeapp/AARSynth | ---
license: cc-by-4.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- https://github.com/AARSynth/Dataset
- **Repository:**
- https://github.com/AARSynth/Dataset
- **Paper:**
- App-Aware Response Synthesis for User Reviews.
Umar Farooq, A.B. Siddique, Fuad Jamour, Zhijia Zhao and Vagelis Hristidis, “App-Aware Response Synthesis for User Reviews,” 2020 IEEE International Conference on Big Data (Big Data), 2020, pp. 699-708, DOI: https://doi.org/10.1109/BigData50022.2020.9377983.
- **Point of Contact:**
- Umar Farooq (ufarooq.cs@gmail.com)
- Abubakar Siddique (abubakar.ucr@gmail.com)
### Dataset Summary
AARSynth is a large-scale app review dataset. There are 570K review-response pairs and more than 2 million user
reviews for 103 popular applications.
### Supported Tasks and Leaderboards
- Question Answering
- Response Generation
### Languages
English
## How to use the dataset?
```python
from datasets import load_dataset

# load the dataset (the review-response pairs live under the 'replies' directory)
mbr_data = load_dataset('recmeapp/AARSynth', data_dir='replies')

# convert the train split directly to a pandas dataframe
# (pandas' to_csv/read_csv require a single-character separator, so a CSV
# round-trip with a multi-character separator like '***' would fail)
aarsynth_df = mbr_data['train'].to_pandas()

# How many interactions are there in the AARSynth dataset?
print(f'There are {len(aarsynth_df)} interactions in the AARSynth dataset.')
```
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Umar Farooq and A.B. Siddique
### Licensing Information
[More Information Needed]
### Citation Information
- App-Aware Response Synthesis for User Reviews.
Umar Farooq, A.B. Siddique, Fuad Jamour, Zhijia Zhao and Vagelis Hristidis, “App-Aware Response Synthesis for User Reviews,” 2020 IEEE International Conference on Big Data (Big Data), 2020, pp. 699-708, DOI: https://doi.org/10.1109/BigData50022.2020.9377983.
### Contributions
[More Information Needed] |
sainv/Multilingual_T2I_clean_llama2_templated_prompts | ---
license: apache-2.0
---
|
AkiraChisaka/sizefetish-jp2cn-translated-text | ---
language:
- zh
- ja
---
|
daniae/cuevana3 | ---
license: artistic-2.0
---
|
abdulhade/AsosoftWhisperv2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 28038699208
num_examples: 29188
download_size: 4307818668
dataset_size: 28038699208
---
# Dataset Card for "AsosoftWhisperv2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chatty123__mistral_rank32_invert | ---
pretty_name: Evaluation run of chatty123/mistral_rank32_invert
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chatty123/mistral_rank32_invert](https://huggingface.co/chatty123/mistral_rank32_invert)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chatty123__mistral_rank32_invert\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T18:56:20.691515](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank32_invert/blob/main/results_2024-04-15T18-56-20.691515.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5963459156241223,\n\
\ \"acc_stderr\": 0.03334804643106429,\n \"acc_norm\": 0.6017021726171937,\n\
\ \"acc_norm_stderr\": 0.03403695297458732,\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5617709613227319,\n\
\ \"mc2_stderr\": 0.015145399162234476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.0146028783885366,\n\
\ \"acc_norm\": 0.5571672354948806,\n \"acc_norm_stderr\": 0.014515573873348892\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6074487153953396,\n\
\ \"acc_stderr\": 0.004873203269366312,\n \"acc_norm\": 0.8119896434973113,\n\
\ \"acc_norm_stderr\": 0.0038992191786572285\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397467,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n\
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.032361983509282745,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.032361983509282745\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691814,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691814\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976708,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.02718449890994161,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.02718449890994161\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011617,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011617\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.012618204066588389,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.012618204066588389\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829163,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829163\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5617709613227319,\n\
\ \"mc2_stderr\": 0.015145399162234476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35178165276724793,\n \
\ \"acc_stderr\": 0.013153446023536033\n }\n}\n```"
repo_url: https://huggingface.co/chatty123/mistral_rank32_invert
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|arc:challenge|25_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|gsm8k|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hellaswag|10_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-56-20.691515.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T18-56-20.691515.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- '**/details_harness|winogrande|5_2024-04-15T18-56-20.691515.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T18-56-20.691515.parquet'
- config_name: results
data_files:
- split: 2024_04_15T18_56_20.691515
path:
- results_2024-04-15T18-56-20.691515.parquet
- split: latest
path:
- results_2024-04-15T18-56-20.691515.parquet
---
# Dataset Card for Evaluation run of chatty123/mistral_rank32_invert
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chatty123/mistral_rank32_invert](https://huggingface.co/chatty123/mistral_rank32_invert) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chatty123__mistral_rank32_invert",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T18:56:20.691515](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank32_invert/blob/main/results_2024-04-15T18-56-20.691515.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the timestamped and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.5963459156241223,
"acc_stderr": 0.03334804643106429,
"acc_norm": 0.6017021726171937,
"acc_norm_stderr": 0.03403695297458732,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5617709613227319,
"mc2_stderr": 0.015145399162234476
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.0146028783885366,
"acc_norm": 0.5571672354948806,
"acc_norm_stderr": 0.014515573873348892
},
"harness|hellaswag|10": {
"acc": 0.6074487153953396,
"acc_stderr": 0.004873203269366312,
"acc_norm": 0.8119896434973113,
"acc_norm_stderr": 0.0038992191786572285
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.037038511930995215,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.037038511930995215
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397467,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.032361983509282745,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.032361983509282745
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691814,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691814
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976708,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.02718449890994161,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.02718449890994161
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011617,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011617
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.012618204066588389,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.012618204066588389
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829163,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829163
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5617709613227319,
"mc2_stderr": 0.015145399162234476
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836673
},
"harness|gsm8k|5": {
"acc": 0.35178165276724793,
"acc_stderr": 0.013153446023536033
}
}
```
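The `all` entry aggregates the per-task metrics. A minimal sketch of recomputing such an average from a results dict of this shape (toy values below, not the full run above):

```python
# Illustrative: recompute an aggregate accuracy from a harness-style
# results dict. The values here are a toy subset, not the full run.
results = {
    "all": {"acc": 0.55},
    "harness|arc:challenge|25": {"acc": 0.5},
    "harness|hellaswag|10": {"acc": 0.6},
    "harness|truthfulqa:mc|0": {"mc1": 0.4},  # no plain "acc" key, skipped
}

# Average "acc" over all task entries, excluding the "all" aggregate itself.
task_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
]
mean_acc = sum(task_accs) / len(task_accs)  # 0.55 for this toy subset
```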
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
OpenAssistant/OASST-DE | ---
dataset_info:
features:
- name: conversation
list:
- name: role
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8022604.792326268
num_examples: 3721
download_size: 4325950
dataset_size: 8022604.792326268
license: apache-2.0
language:
- de
size_categories:
- 1K<n<10K
---
# German OpenAssistant Conversations Dataset (OASST-DE)
With the goal of advancing open-source, German-language LLM research, we present
OASST-DE: a high-quality subset of a recent (25.08.23) dump from the [OpenAssistant website](https://www.open-assistant.io/)
translated to German using the GPT-3.5 API. More details on how the dataset was filtered and translated can be found under [dataset creation](#dataset-creation-process).
For more details on the OpenAssistant Project, look at the [first OASST dataset (OASST1)](https://huggingface.co/datasets/OpenAssistant/oasst1), [the Open-Assistant GitHub repo](https://github.com/LAION-AI/Open-Assistant)
or [our paper](https://arxiv.org/abs/2304.07327).
This dataset was created as part of LAION's LeoLM (Linguistically Enhanced Open Language Model) project led by Björn Plüster.
Check out the LeoLM-Chat models finetuned on OASST-DE ([7b](https://huggingface.co/LeoLM/leo-hessianai-7b-chat), [13b](https://huggingface.co/LeoLM/leo-hessianai-13b-chat)) and read [the blog post](https://laion.ai/blog/leo-lm/) for more info on LeoLM.
## Dataset Creation Process
This dataset was created from a recent OASST dump by following these steps:
- Filter for Top1 response trees with assistant response leaves
- Filter first prompt quality >= 0.5
- Filter total conversation length < 1900 tokens to fit in GPT-3.5's context length
- Filter for `'lang' == 'de'` -> add to dataset
- Filter for `'lang' == 'en'` (other languages often result in failed translations)
- Translate using the GPT-3.5-turbo API (total cost ~$15).
This results in around 3.7k samples of high-quality assistant conversations.
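The filtering steps above can be sketched roughly as follows — the record field names (`lang`, `quality`, `n_tokens`) are illustrative assumptions, not the exact schema of the OASST dump:

```python
# Hedged sketch of the filtering pipeline on toy conversation records.
conversations = [
    {"lang": "de", "quality": 0.8, "n_tokens": 900},   # kept as-is
    {"lang": "en", "quality": 0.4, "n_tokens": 500},   # dropped: low quality
    {"lang": "en", "quality": 0.9, "n_tokens": 2500},  # dropped: too long
    {"lang": "en", "quality": 0.7, "n_tokens": 1200},  # kept -> translate
]

kept = [
    c for c in conversations
    if c["quality"] >= 0.5          # first-prompt quality threshold
    and c["n_tokens"] < 1900        # fits GPT-3.5 context
    and c["lang"] in ("de", "en")   # other languages often fail translation
]
# English conversations would then be sent to the translation API.
needs_translation = [c for c in kept if c["lang"] == "en"]
```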
## Dataset Structure
This dataset has only one `'conversation'` field. Each example is a list of an alternating conversation between `'prompter'` and `'assistant'`,
where each entry is a dict with `'text'` and `'role'` fields:
```json
"conversation": [
{"role": "prompter", "text": "Moin, wie geht's dir?"},
{"role": "assistant", "text": "Moin Moin! Mir geht es gut, und dir?"},
...
]
```
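Each record in this shape can be flattened into a single training string; a minimal sketch (the role-tag format here is an illustrative choice, not a prescribed template):

```python
# Flatten one conversation into a single string for SFT-style training.
conversation = [
    {"role": "prompter", "text": "Moin, wie geht's dir?"},
    {"role": "assistant", "text": "Moin Moin! Mir geht es gut, und dir?"},
]

# One line per turn, tagged with the speaker role.
text = "\n".join(f"<|{turn['role']}|> {turn['text']}" for turn in conversation)
```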
## Usage with 🤗Datasets:
```python
from datasets import load_dataset
ds = load_dataset("OpenAssistant/OASST-DE", split="train")
print(ds[0]["conversation"])
``` |
CJWeiss/multilexsum | ---
dataset_info:
features:
- name: id
dtype: string
- name: sources
sequence: string
- name: summary/long
dtype: string
- name: summary/short
dtype: string
- name: summary/tiny
dtype: string
splits:
- name: train
num_bytes: 1381375968
num_examples: 3404
- name: test
num_bytes: 265556706
num_examples: 681
- name: valid
num_bytes: 199444854
num_examples: 454
download_size: 833868199
dataset_size: 1846377528
---
# Dataset Card for "multilexsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-d60b4e7e-7574884 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: Leizhang/xlm-roberta-base-finetuned-panx-de
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.de
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Leizhang/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
adambuttrick/100K-ner-indexes-multiple-organizations-locations-alpaca-format-json-response-all-cases | ---
license: cc0-1.0
---
|
felfri/MAGBIG | ---
license: apache-2.0
---
# MAGBIG benchmark
This is the MAGBIG benchmark proposed in https://arxiv.org/abs/2401.16092
This benchmark is intended for multilingual text-to-image models. With MAGBIG, you can generate images for a diverse set of prompts across ten different languages and evaluate the resulting images for differences across languages. MAGBIG is designed to uncover and assess biases such as gender, race, and age across languages. This way, we can measure not only whether bias exists in a language, but also whether it is consistent across languages. Beyond bias, the benchmark can also be used to evaluate multilingual text-to-image models for their general abilities across languages, e.g. image quality and versatility.
It contains three sets of prompts in three files, covering English, German, Arabic, French, Italian, Spanish, Russian, Korean, Japanese, and Chinese:
1) adjective prompts (`adjectives.csv`) that use standard language
2) direct occupation prompts (`occupations_direct.csv`) that use standard language
3) indirect occupation prompts (`occupations_indirect.csv`) that use gender-neutral descriptions
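For quick inspection, such a prompt file can be read with the standard `csv` module. A minimal sketch, assuming (hypothetically) one column per language — the column names and toy content below are illustrative, not the actual MAGBIG file schema:

```python
import csv
import io

# Hypothetical illustration: the "en"/"de" columns and this toy row are
# assumptions, not the real MAGBIG schema. Replace io.StringIO with
# open("occupations_direct.csv") for the actual files.
toy_csv = io.StringIO(
    "en,de\n"
    "a photo of an accountant,ein Foto einer Buchhaltungsfachkraft\n"
)
rows = list(csv.DictReader(toy_csv))
prompt_en = rows[0]["en"]
```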
You can also find our code available at https://github.com/felifri/MAGBIG
### Citation
If you like or use our work, please consider citing us.
```
@misc{friedrich2024multilingual,
title={Multilingual Text-to-Image Generation Magnifies Gender Stereotypes and Prompt Engineering May Not Help You},
author={Felix Friedrich and Katharina Hämmerl and Patrick Schramowski and Jindrich Libovicky and Kristian Kersting and Alexander Fraser},
year={2024},
eprint={2401.16092},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
generativeaidemo/generadai-sample | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: price
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 1905
num_examples: 5
download_size: 5923
dataset_size: 1905
---
# Dataset Card for "generadai-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_wind | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 610104
num_examples: 4622
- name: validation
num_bytes: 257664
num_examples: 1952
download_size: 180088
dataset_size: 867768
---
# Dataset Card for "metatree_wind"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Paulitos/school-math-questions-llama2-pt-br | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5198707
num_examples: 8792
download_size: 2587864
dataset_size: 5198707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SicariusSicariiStuff/MATH_ADD | ---
license: apache-2.0
---
|
rmarcosg/bark-detection | ---
license: apache-2.0
task_categories:
- audio-classification
tags:
- biology
pretty_name: Bark detection
size_categories:
- 10K<n<100K
---
# Bark detection dataset
## Dataset Description
This dataset comprises positive and negative audio samples, each 1 second long, in WAV format recorded at 44.1 kHz.
Negative samples include music, voice, claps, whistles, and vacuum cleaner noise, among other sounds you might record inside a house.
Caveats:
- This is an imbalanced dataset: ~10k negatives vs. ~500 positives.
- Positive samples may include human-generated barks.
- A few positive samples are actually false positives.
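Given this imbalance, training a classifier on it usually benefits from class weighting. A minimal sketch of inverse-frequency weights (sklearn's "balanced" heuristic), using the approximate counts above:

```python
# Approximate label counts from the caveats above (illustrative).
counts = {"no": 10_000, "yes": 500}

# "Balanced" heuristic: weight = total / (n_classes * class_count),
# so the rare "yes" (bark) class is upweighted ~20x relative to "no".
total = sum(counts.values())
weights = {label: total / (len(counts) * n) for label, n in counts.items()}
# weights == {"no": 0.525, "yes": 10.5}
```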
## Example usage
```python
from datasets import load_dataset
dataset = load_dataset("rmarcosg/bark-detection", split="train")
```
## Supported Tasks
- audio-classification: This dataset was created for the task of binary classification.
## Dataset Structure
```
{
'file_id': '20221203_185908.wav',
'audio': {
'path': 'train/no/20221203_185908.wav',
'array': array([ 6.10351562e-05, -3.05175781e-05, -3.05175781e-05, ...,
6.10351562e-05, -1.22070312e-04, 1.83105469e-04]),
'sampling_rate': 44100
},
'label': 'no'
}
```
# Additional Information
## Licensing Information
The dataset is available under the Apache 2.0 license.
## Citation Information
Please cite the dataset when using it as:
```
@InProceedings{huggingface:dataset,
title = {A bark detection dataset with positive and negative samples of 1 second},
author={Rodrigo Marcos García},
year={2024}
}
```
## Contributions
Thanks to Greta (my dog) for her help.
hkust-nlp/deita-6k-v0 | ---
license: mit
task_categories:
- conversational
language:
- en
size_categories:
- 1K<n<10K
---
<img src="https://huggingface.co/datasets/hkust-nlp/deita-images/resolve/main/logo-final.png" alt="Deita banner" width="800" style="margin-left:auto; margin-right:auto; display:block;"/>
# Dataset Card for Deita 6K V0
[GitHub](https://github.com/hkust-nlp/deita) | [Paper](https://arxiv.org/abs/2312.15685)
Deita is an open-sourced project designed to facilitate **Automatic Data Selection** for instruction tuning in Large Language Models (LLMs).
This dataset includes 6K **lightweight, high-quality** alignment SFT examples, mainly automatically selected from the following datasets:
- [ShareGPT](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered) (Apache 2.0 listed, no official repo found): Use the 58K ShareGPT dataset for selection.
- [UltraChat](https://huggingface.co/datasets/stingning/ultrachat) (MIT): Sample from the 105K UltraChat dataset for selection.
- [WizardLM](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k): Use the evolved data of Alpaca and ShareGPT, a 143K mixture, for selection.
**Model Family**: Other models and the dataset are found in the [Deita Collection](https://huggingface.co/collections/hkust-nlp/deita-6569c198c174808d94cf5bd4)
## Performance
| Model | Align | Data Size | MT-Bench | AlpacaEval(%) | OpenLLM (Avg.) |
|------------------------------------------------|-----------|------------|----------|---------------|----------------|
| **Proprietary Models** | | | | | |
| GPT-4-Turbo | ? | -- | 9.32 | 97.70 | -- |
| GPT-4 | SFT + PPO | -- | 8.99 | 95.03 | -- |
| Claude-2 | SFT + PPO | -- | 8.06 | 91.36 | -- |
| GPT-3.5-turbo | SFT + PPO | -- | 7.94 | 89.37 | -- |
| **Open-sourced Models based on LLaMA-1-13B** | | | | | |
| LIMA | SFT | 1K SFT | 4.29 | 41.98 | 59.82 |
| WizardLM-13B | SFT | 70K SFT | 6.35 | 75.31 | 58.96 |
| Vicuna-13B-v1.3 | SFT | 125K SFT | 6.39 | 82.11 | 60.01 |
| Random | SFT | 10K SFT | 6.03 | 71.52 | 60.14 |
| DEITA-LLaMA1-13B-v1.0-sft | SFT | 10K SFT | 6.60 | 78.01 | 64.27 |
| **Open-sourced Models based on LLaMA-2-13B** | | | | | |
| Tulu-2-13B | SFT | 326K SFT | 6.70 | 78.90 | -- |
| Tulu-2-13B+DPO | SFT + DPO | 326K SFT + 60K DPO | 7.00 | 89.50 | -- |
| LLaMA2-13B-Chat | SFT + PPO | -- | 6.65 | 81.09 | -- |
| WizardLM-13B-v1.2 | SFT | >70K SFT | 7.09 | 89.17 | -- |
| Vicuna-13B-v1.5 | SFT | 125K SFT | 6.57 | 78.80 | 61.63 |
| Random | SFT | 10K SFT | 5.78 | 65.19 | 61.32 |
| DEITA-LLaMA2-13B-v1.0-sft | SFT | 10K SFT | 6.79 | 81.09 | 62.71 |
| **Open-sourced Models based on Mistral-7B** | | | | | |
| Mistral-7B-Instruct-v0.1 | -- | -- | 6.84 | 69.65 | 60.45 |
| Zephyr-7B-sft | SFT | 200K SFT | 5.32 | 75.12 | 60.93 |
| $\text{Zephyr-7B-}\beta$ | SFT + DPO | 200K SFT + 60K DPO | 7.34 | 90.60 | 66.36 |
| OpenChat-3.5 | C-RLFT | >> 70K C-RLFT | 7.81 | 88.51 | -- |
| Starling-7B | C-RLFT + APA | >>70K C-RLFT + 183K APA | 8.09 | 91.99 | -- |
| Random | SFT | 10K SFT | 5.89 | 56.90 | 61.72 |
| DEITA-7B-v1.0-sft (6K) | SFT | 6K SFT | 7.22 | 80.78 | 64.94 |
| DEITA-7B-v1.0-sft (10K) | SFT | 10K SFT | 7.32 | 81.67 | 64.00 |
| DEITA-7B-v1.0 | SFT + DPO | 6K SFT + 10K DPO | 7.55 | 90.06 | 69.86 |
## Citation
If you find the content of this project helpful, please cite our paper as follows:
```
@misc{liu2023what,
title={What Makes Good Data for Alignment? A Comprehensive Study of Automatic Data Selection in Instruction Tuning},
author={Wei Liu and Weihao Zeng and Keqing He and Yong Jiang and Junxian He},
year={2023},
eprint={2312.15685},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
Lionelpang/forme | ---
license: apache-2.0
---
|
yangyz1230/enhancers_types | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 977287
num_examples: 3919
- name: test
num_bytes: 99273
num_examples: 400
download_size: 507797
dataset_size: 1076560
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/eir_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eir (Fire Emblem)
This is the dataset of eir (Fire Emblem), containing 87 images and their tags.
The core tags of this character are `long_hair, ponytail, breasts, earrings, blue_eyes, grey_hair, hair_ornament, very_long_hair, large_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 87 | 115.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eir_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 87 | 66.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eir_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 206 | 135.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eir_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 87 | 102.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eir_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 206 | 185.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eir_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/eir_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, choker, cleavage, solo, wide_sleeves, black_gloves, detached_sleeves, jewelry, bare_shoulders, long_sleeves, looking_at_viewer, black_dress, closed_mouth, collarbone, holding_dagger, strapless, white_hair |
| 1 | 5 |  |  |  |  |  | 1girl, flower, jewelry, solo, wide_sleeves, alternate_costume, animal, bangs, bird, blue_butterfly, dagger, detached_sleeves, frills, full_body, holding_weapon, long_dress, parted_lips, sandals, toeless_footwear, toes, gold_trim, looking_at_viewer, shiny_hair, smile, sword, white_background, covered_collarbone, dual_wielding, hand_up, looking_away, petals, see-through, simple_background, standing, transparent_background |
| 2 | 7 |  |  |  |  |  | 1girl, open_mouth, penis, hetero, jewelry, nipples, 1boy, black_gloves, blush, solo_focus, bar_censor, cum, sex, simple_background, vaginal, white_background, nude, pussy_juice, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | choker | cleavage | solo | wide_sleeves | black_gloves | detached_sleeves | jewelry | bare_shoulders | long_sleeves | looking_at_viewer | black_dress | closed_mouth | collarbone | holding_dagger | strapless | white_hair | flower | alternate_costume | animal | bangs | bird | blue_butterfly | dagger | frills | full_body | holding_weapon | long_dress | parted_lips | sandals | toeless_footwear | toes | gold_trim | shiny_hair | smile | sword | white_background | covered_collarbone | dual_wielding | hand_up | looking_away | petals | see-through | simple_background | standing | transparent_background | open_mouth | penis | hetero | nipples | 1boy | blush | solo_focus | bar_censor | cum | sex | vaginal | nude | pussy_juice | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-----------|:-------|:---------------|:---------------|:-------------------|:----------|:-----------------|:---------------|:--------------------|:--------------|:---------------|:-------------|:-----------------|:------------|:-------------|:---------|:--------------------|:---------|:--------|:-------|:-----------------|:---------|:---------|:------------|:-----------------|:-------------|:--------------|:----------|:-------------------|:-------|:------------|:-------------|:--------|:--------|:-------------------|:---------------------|:----------------|:----------|:---------------|:---------|:--------------|:--------------------|:-----------|:-------------------------|:-------------|:--------|:---------|:----------|:-------|:--------|:-------------|:-------------|:------|:------|:----------|:-------|:--------------|:-------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
openbmb/UltraInteract_sft | ---
language:
- en
license: mit
pretty_name: UltraInteract_sft
configs:
- config_name: default
data_files:
- split: train
path: 0000_sft.parquet
dataset_info:
features:
- name: task
dtype: string
- name: dataset
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: id
dtype: string
- name: parent_id
dtype: string
splits:
- name: train
num_bytes: 687238
num_examples: 288579
download_size: 687238
dataset_size: 687238
---
## Introduction
- 📜 [Paper](https://arxiv.org/abs/2404.02078)
- 🤗 [Eurus Collection](https://huggingface.co/collections/openbmb/eurus-660bc40bec5376b3adc9d1c5)
- 🤗 UltraInteract
- [SFT](https://huggingface.co/datasets/openbmb/UltraInteract_sft)
- [Preference Learning](https://huggingface.co/datasets/openbmb/UltraInteract_pair)
- [GitHub Repo](https://github.com/OpenBMB/Eurus)
UltraInteract is a large-scale, high-quality alignment dataset specifically designed for complex reasoning tasks. For each instruction, it includes a preference tree consisting of
- (1) reasoning chains with diverse planning strategies in a unified format
- (2) multi-turn interaction trajectories with the environment and the critique
- (3) pairwise data to facilitate preference learning
## Structure
UltraInteract collects a preference tree for each instruction, with the instruction being the root and each action a node. A trajectory is a root-to-leaf path consisting of a sequence of actions. In each preference tree, all nodes of correct actions and all trajectories ending with correct actions can be used for SFT. Paired correct and incorrect nodes or trajectories can be used for preference learning.
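Since rows that share a `parent_id` answer the same instruction with different actions, the flat SFT split can be regrouped into per-instruction bundles. A sketch over plain dicts carrying the card's fields (the sample values are illustrative, not real rows):

```python
from collections import defaultdict

def group_by_parent(rows):
    """Bundle rows that share an instruction (i.e. the same parent_id)."""
    groups = defaultdict(list)
    for row in rows:
        groups[row['parent_id']].append(row)
    return dict(groups)

# Illustrative rows with the same fields as the SFT parquet.
rows = [
    {'id': 'aaa_1_1', 'parent_id': 'p1', 'task': 'Coding', 'response': '...'},
    {'id': 'bbb_1_1', 'parent_id': 'p1', 'task': 'Coding', 'response': '...'},
    {'id': 'ccc_1_1', 'parent_id': 'p2', 'task': 'Logic', 'response': '...'},
]
bundles = group_by_parent(rows)
print(len(bundles))  # 2 distinct instructions
```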
<img src="./figures/tree.png" alt="tree" style="zoom: 20%;" />
## Illustrative Example
Here is an illustrative example of an UltraInteract trajectory over two turns. In each turn, the actor model generates step-by-step reasoning chains, and the environment and the critique model provide observations and textual critique, respectively.
<img src="./figures/ui_example.png" alt="ui_example" style="zoom: 25%;" />
## Stats
Below are some statistics about UltraInteract. It consists of 86k instructions, 286k correct answers, and 219k pairs.
<img src="./figures/stats.png" alt="stats" style="zoom: 40%;" />
## Dataset Format (SFT)
Coding
```JSONC
{
"id": "92a7356beaa75391539cb79624bdf85b7e5d36d5e74e46a2f1c4001f40b3e055_1_1", // unique data id, contains three parts, the unique string, the depth of the preference tree, and current turn, separated by '_'
"parent_id": "535550938ff58bd73cd7fb85be834bb90b7600daf6b6f6a16b64091e5ecbfe43", // data with the same parent_id have the same instruction but different actions.
"task": "Coding", // task type, choices = ["Coding", "Math_Cot", "Math_PoT", "Logic"]
"dataset": "codecontest", // the dataset where the instruction comes from
"instruction": "Write Python code to solve the task.\nMike has a string s consisting of only lowercase English letters. He wants to change exactly one character from the string so that the resulting one is a palindrome. \n\nA palindrome is a string that reads the same backward as forward, for example strings \"z\", \"aaa\", \"aba\", \"abccba\" are palindromes, but strings \"codeforces\", \"reality\", \"ab\" are not.\n\nInput\n\nThe first and single line contains string s (1 ≤ |s| ≤ 15).\n\nOutput\n\nPrint \"YES\" (without quotes) if Mike can change exactly one character so that the resulting string is palindrome or \"NO\" (without quotes) otherwise. \n\nExamples\n\nInput\n\nabccaa\n\n\nOutput\n\nYES\n\n\nInput\n\nabbcca\n\n\nOutput\n\nNO\n\n\nInput\n\nabcda\n\n\nOutput\n\nYES", // the task description.
"response": "
Step 1: This problem asks us to determine if it is possible to change exactly one character in the string to make it a palindrome.
Step 2: We can start by iterating through the characters in the string and counting the number of differences between corresponding characters from the beginning and end of the string.
Step 3: If the length of the string is odd and there is only one difference, or if the length is even and there is exactly one difference, then it is possible to change exactly one character to make the string a palindrome.
Step 4: We need to consider edge cases such as a single character string or an empty string.
```python
# Step 5: Take the input string
s = input()
# Step 6: Calculate the length of the string
n = len(s)
# Step 7: Initialize a counter to keep track of the number of differences
cnt = 0
# Step 8: Iterate through the first half of the string and count the number of differences between corresponding characters from the beginning and end of the string.
for i in range(n//2):
if s[i] != s[n-i-1]:
cnt += 1
# Step 9: If the length of the string is odd and there are no differences, or if the length is even and there is exactly one difference, print "YES" (without quotes)
if (n % 2 == 1 and cnt == 0) or (cnt == 1):
print('YES')
# Step 10: Otherwise, print "NO" (without quotes)
else:
print('NO')
```" // The step-by-step solution.
}
```
Math
```JSONC
{
"id": "647b2e8706cf459a8e3ec8e99727d2d51ff3414bf187db43d92cff87f8012713_1_1",
"parent_id": "499d250c5852eddcd48a28b6c173f1a2349a16555990a1cd6fb5c7954cb94587",
"task": "Math_PoT",
"dataset": "mathqa",
"instruction": "Tool available:\n[1] Python interpreter\nWhen you send a message containing Python code to python, it will be executed in a stateful Jupyter notebook environment.\nSolve the following math problem step-by-step.\nthe length of a rectangle is doubled while its width is doubled . what is the % change in area ?",
"response": "
Step 1: The area of a rectangle is given by the product of its length and width. If the length is doubled, the area changes from l*w to 2l*w. Similarly, if the width is doubled, the area changes from l*w to l*2w.
Step 2: If both the length and width are doubled, the area changes from l*w to 2l*2w. The percentage change in area is given by ((new area - old area) / old area)*100.
Step 3: We can calculate the percentage change in area using the formula above.
Step 4: By creating a tool to calculate the percentage change in area, we can easily solve the question.
```python
# Step 5: Define a tool to calculate the percentage change in area
def calculate_percentage_change(length, width):
# Step 6: Calculate the old and new areas
old_area = length * width
new_area = 2 * length * 2 * width
# Step 7: Calculate the percentage change
percentage_change = ((new_area - old_area) / old_area) * 100
return percentage_change
# Step 8: Call the function to solve the question
percentage_change = calculate_percentage_change(2, 2)
print(percentage_change)
```"
}
```
Logic
```JSONC
{
"id": "92a19848da4fb0fc0b414f061f7500a81ea4cbd8f922ca1e32a6a15ef6f7e1a0_1_1",
"parent_id": "107cc9147d8c09e62e2ba9e74cb41d9dbafd9a3adf62b67ba3da2b31c81b3300",
"task": "Logic",
"dataset": "hotpotqa",
"instruction": "
Tool available:\n[1] wikipedia_search(query: str) -> str\nThe Wikipedia Search tool provides access to a vast collection of articles covering a wide range of topics.\nCan query specific keywords or topics to retrieve accurate and comprehensive information.
Solve the following problem step-by-step:\nWhich musician or group is from the US, 3OH!3 or Regina Spektor?
Step 1: To determine which musician or group is from the US, we can search for information about each of them.
```python
# Step 2: Searching for the information about 3OH!3.
print(wikipedia_search('3OH!3'))
```
Page: 3OH!3
Summary: 3OH!3 (pronounced \"three oh three\") is an American electronic music duo from Boulder, Colorado, made up of Sean Foreman and Nathaniel Motte. They are best known for their single \"DONTTRUSTME\" from their album Want, which reached number seven on the Billboard Hot 100. Their second single, a remix of \"Starstrukk\" featuring Katy Perry from Want, was a top ten hit in the United Kingdom, Ireland, Finland, Poland, and Australia. They gained further recognition by featuring Kesha on the song \"My First Kiss\", which was made the lead single from their album Streets of Gold. The album later peaked at number seven on the Billboard 200.\n\n\n\nPage: 3OH!3 (album)\nSummary: 3OH!3 is the debut studio album by American electronic music band 3OH!3. The tracks \"Holler Til You Pass Out\" and \"Chokechain\" were later remixed for the band's major label debut. \"Holler Til You Pass Out\", \"Chokechain\", and \"Dance with Me\" are excluded from the iTunes release of the album.",
"response": "
Step 1: The provided information confirms that 3OH!3 is an American electronic music duo from Boulder, Colorado.
Step 2: We no longer need to perform a search for Regina Spektor as we were only tasked to determine which of the two options is from the US and we have already confirmed 3OH!3's origin.
Step 3: We can confidently conclude that 3OH!3 is the musician/group from the US.
Answer:3OH!3"
}
```
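As noted in the Coding example above, each `id` consists of a unique string, the depth of the preference tree, and the current turn, joined by `'_'`. A small helper to unpack it (a sketch):

```python
def parse_ultrainteract_id(example_id):
    """Split an UltraInteract id into (unique string, tree depth, turn)."""
    # rsplit keeps any underscores inside the leading unique string intact.
    unique, depth, turn = example_id.rsplit('_', 2)
    return unique, int(depth), int(turn)

unique, depth, turn = parse_ultrainteract_id(
    '92a19848da4fb0fc0b414f061f7500a81ea4cbd8f922ca1e32a6a15ef6f7e1a0_1_1')
print(depth, turn)  # 1 1
```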
## Citation
```bib
@misc{yuan2024advancing,
title={Advancing LLM Reasoning Generalists with Preference Trees},
author={Lifan Yuan and Ganqu Cui and Hanbin Wang and Ning Ding and Xingyao Wang and Jia Deng and Boji Shan and Huimin Chen and Ruobing Xie and Yankai Lin and Zhenghao Liu and Bowen Zhou and Hao Peng and Zhiyuan Liu and Maosong Sun},
year={2024},
eprint={2404.02078},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
|
adityarra07/test_ds_noise | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 228088121.30052426
num_examples: 1000
download_size: 224454975
dataset_size: 228088121.30052426
---
# Dataset Card for "test_ds_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reknine69/QA-citations | ---
task_categories:
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
QA pairs with context from public documentation from Zerto, Carbonite, VMware, etc. |
zolak/twitter_dataset_50_1713152694 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 552583
num_examples: 1270
download_size: 273289
dataset_size: 552583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kaina99/MatheusSONIC | ---
license: openrail
---
|
dar5654/masked5-dataset-train | ---
dataset_info:
features:
- name: image
dtype: image
- name: annotation
dtype: image
- name: scene_category
dtype: int64
splits:
- name: train
num_bytes: 2726241.0
num_examples: 40
download_size: 2733884
dataset_size: 2726241.0
---
# Dataset Card for "masked5-dataset-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zeio/patch | ---
license: apache-2.0
---
# Patch
Threads pulled from 2ch
|
ethz-spylab/competition_trojan3 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 56467074
num_examples: 42537
download_size: 31382985
dataset_size: 56467074
---
# Poisoned dataset with Trojan 3
This dataset was poisoned with the trojan `SpyL4bb` and used to train the respective reward and generation models. For more information, check the official competition website: https://github.com/ethz-spylab/rlhf_trojan_competition |
open-llm-leaderboard/details_ConvexAI__BurningBruce-004 | ---
pretty_name: Evaluation run of ConvexAI/BurningBruce-004
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ConvexAI/BurningBruce-004](https://huggingface.co/ConvexAI/BurningBruce-004)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__BurningBruce-004\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T18:13:25.503576](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-004/blob/main/results_2024-02-02T18-13-25.503576.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532196328486137,\n\
\ \"acc_stderr\": 0.03212525213314176,\n \"acc_norm\": 0.6523598486584207,\n\
\ \"acc_norm_stderr\": 0.03279924819889533,\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6839428568931519,\n\
\ \"mc2_stderr\": 0.0152008998758035\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520769,\n\
\ \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496364\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7200756821350328,\n\
\ \"acc_stderr\": 0.004480442446762916,\n \"acc_norm\": 0.8862776339374626,\n\
\ \"acc_norm_stderr\": 0.003168249351889309\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n\
\ \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"\
acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6839428568931519,\n\
\ \"mc2_stderr\": 0.0152008998758035\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \
\ \"acc_stderr\": 0.012551285331470152\n }\n}\n```"
repo_url: https://huggingface.co/ConvexAI/BurningBruce-004
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-13-25.503576.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- '**/details_harness|winogrande|5_2024-02-02T18-13-25.503576.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T18-13-25.503576.parquet'
- config_name: results
data_files:
- split: 2024_02_02T18_13_25.503576
path:
- results_2024-02-02T18-13-25.503576.parquet
- split: latest
path:
- results_2024-02-02T18-13-25.503576.parquet
---
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-004
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-004](https://huggingface.co/ConvexAI/BurningBruce-004) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-004",
	"harness_winogrande_5",
	split="latest")
```
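Once loaded, individual scores can be read straight off the results payload, which is a plain mapping of harness task names to metric dicts. A minimal sketch, reusing two of the values reported in the "Latest results" section of this card (the key names are the harness task names used throughout):

```python
# A minimal sketch (not part of the evaluation pipeline): indexing per-task
# metrics out of a results payload shaped like the JSON in "Latest results".
# Only two tasks are reproduced here for brevity.
results = {
    "harness|winogrande|5": {
        "acc": 0.840568271507498,
        "acc_stderr": 0.010288617479454764,
    },
    "harness|gsm8k|5": {
        "acc": 0.7058377558756633,
        "acc_stderr": 0.012551285331470152,
    },
}

# Collect the headline accuracy for each task.
accuracies = {task: metrics["acc"] for task, metrics in results.items()}
print(accuracies["harness|gsm8k|5"])  # 0.7058377558756633
```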
## Latest results
These are the [latest results from run 2024-02-02T18:13:25.503576](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-004/blob/main/results_2024-02-02T18-13-25.503576.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6532196328486137,
"acc_stderr": 0.03212525213314176,
"acc_norm": 0.6523598486584207,
"acc_norm_stderr": 0.03279924819889533,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6839428568931519,
"mc2_stderr": 0.0152008998758035
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520769,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.012928933196496364
},
"harness|hellaswag|10": {
"acc": 0.7200756821350328,
"acc_stderr": 0.004480442446762916,
"acc_norm": 0.8862776339374626,
"acc_norm_stderr": 0.003168249351889309
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6839428568931519,
"mc2_stderr": 0.0152008998758035
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.7058377558756633,
"acc_stderr": 0.012551285331470152
}
}
```
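Per-task scores like the ones above can be aggregated into a single macro-average, as the leaderboard does for the MMLU (`hendrycksTest`) tasks. A minimal sketch, using only a small inlined subset of the results above rather than the full 57-task dictionary:

```python
# Sketch: macro-averaging per-task accuracies from a leaderboard results dict.
# Only three of the 57 hendrycksTest entries above are inlined for illustration.

results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Select the MMLU tasks by their "harness|hendrycksTest-" key prefix.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]

# Unweighted mean of the per-task accuracies (macro-average).
mmlu_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(round(mmlu_acc, 4))  # 0.5581
```

On the real file, the same prefix filter applied to the full JSON reproduces the leaderboard's MMLU aggregate.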
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Mohanakrishnan/sql_query_example | ---
license: other
---
|
JordanTallon/CA3109-Movie-Genre-Classification-From-Keywords | ---
license: cc0-1.0
---
|
DylanonWic/common_voice_10_1_th_clean_split_0_old | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: labels
sequence: int64
- name: input_values
sequence: float32
splits:
- name: train
num_bytes: 13074645939.656857
num_examples: 50670
download_size: 11878689391
dataset_size: 13074645939.656857
---
# Dataset Card for "common_voice_10_1_th_clean_split_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GabrielTOP/BaianinhoCB | ---
license: openrail
---
|
davidfant/natural-questions-chunk-26 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4647266600
num_examples: 10000
download_size: 1802877089
dataset_size: 4647266600
---
# Dataset Card for "natural-questions-chunk-26"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wanhao/text_image | ---
task_categories:
- text-classification
- translation
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
shivangibithel/Flickr8k | ---
license: apache-2.0
task_categories:
- image-to-text
- text-to-image
language:
- en
size_categories:
- 1K<n<10K
--- |
Prathamesh25/university_que_ans_v1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 43629
num_examples: 299
download_size: 9875
dataset_size: 43629
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
prajdabre/KreolMorisienMT | ---
license: cc
---
MorisienMT is a dataset for Mauritian Creole Machine Translation.
This dataset consists of training, development and test set splits for English--Creole as well as French--Creole translation.
The data comes from a variety of sources and hence can be considered as belonging to the general domain.
The development and test sets consist of 500 and 1000 sentences respectively. Both evaluation sets are trilingual.
The training set for English--Creole contains 21,810 lines.
The training set for French--Creole contains 15,239 lines.
Additionally, one can extract a trilingual English-French-Creole training set of 13,861 lines using Creole as a pivot.
Finally, we also provide a Creole monolingual corpus of 45,364 lines.
Note that a significant portion of the dataset is a dictionary of word pairs/triplets, nevertheless it is a start.
Usage:
1. Using Hugging Face `datasets`: `load_dataset("prajdabre/MorisienMT", "en-cr", split="train")`
2. Converting to Moses format: load the dataset as in step 1; each item is a JSON object, so iterate over the loaded dataset object and use the "input" and "target" keys to get the translation pairs.
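The Moses-format conversion described in step 2 can be sketched as follows. The record layout (`{"input": ..., "target": ...}`) follows the description above; a tiny inline sample stands in for the downloaded data, and the actual `load_dataset` call (which needs network access) is shown in a comment:

```python
# Sketch: split MorisienMT records into parallel (Moses-style) line lists.
# Assumes each record has "input" and "target" keys, as described above.

def to_moses_pairs(records):
    """Return (source_lines, target_lines) from an iterable of records."""
    src = [r["input"].strip() for r in records]
    tgt = [r["target"].strip() for r in records]
    return src, tgt

# Tiny illustrative sample in place of the real dataset.
sample = [
    {"input": "Hello", "target": "Bonzour"},
    {"input": "Thank you", "target": "Mersi"},
]
src, tgt = to_moses_pairs(sample)
print(src)  # ['Hello', 'Thank you']

# With the real data (requires network access):
# from datasets import load_dataset
# ds = load_dataset("prajdabre/MorisienMT", "en-cr", split="train")
# src, tgt = to_moses_pairs(ds)
# then write "\n".join(src) and "\n".join(tgt) to train.en / train.cr
```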
Feel free to use the dataset for your research but don't forget to attribute our upcoming paper which will be uploaded to arxiv shortly.
Note: MorisienMT was originally partly developed by Dr Aneerav Sukhoo from the University of Mauritius in 2014 when he was a visiting researcher in IIT Bombay.
Dr Sukhoo and I worked on the MT experiments together, but never publicly released the dataset back then.
Furthermore, the dataset splits and experiments were not done in a highly principled manner, which is required in the present day.
Therefore, we improve the quality of splits and officially release the data for people to use. |
ahmadkhan1022/kaggle | ---
license:
- pddl
converted_from: kaggle
kaggle_id: ahmadkhan10/mergeddataset
---
# Dataset Card for MergedDataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/ahmadkhan10/mergeddataset
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@ahmadkhan10](https://kaggle.com/ahmadkhan10)
### Licensing Information
The license for this dataset is pddl
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged | ---
pretty_name: Evaluation run of StudentLLM/Alpagasus-2-13b-QLoRA-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [StudentLLM/Alpagasus-2-13b-QLoRA-merged](https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T20:12:36.073167](https://huggingface.co/datasets/open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged/blob/main/results_2023-10-27T20-12-36.073167.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.000405845113241773,\n \"f1\": 0.06265939597315423,\n\
\ \"f1_stderr\": 0.001378921060077413,\n \"acc\": 0.4424643648503177,\n\
\ \"acc_stderr\": 0.010216085204246378\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241773,\n\
\ \"f1\": 0.06265939597315423,\n \"f1_stderr\": 0.001378921060077413\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11144806671721001,\n \
\ \"acc_stderr\": 0.008668021353794427\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698329\n\
\ }\n}\n```"
repo_url: https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|arc:challenge|25_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T10_45_15.535939
path:
- '**/details_harness|drop|3_2023-10-26T10-45-15.535939.parquet'
- split: 2023_10_27T20_12_36.073167
path:
- '**/details_harness|drop|3_2023-10-27T20-12-36.073167.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T20-12-36.073167.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T10_45_15.535939
path:
- '**/details_harness|gsm8k|5_2023-10-26T10-45-15.535939.parquet'
- split: 2023_10_27T20_12_36.073167
path:
- '**/details_harness|gsm8k|5_2023-10-27T20-12-36.073167.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T20-12-36.073167.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hellaswag|10_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-18-21.353761.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T18-18-21.353761.parquet'
- split: 2023_09_21T21_35_59.433556
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-35-59.433556.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T21-35-59.433556.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T10_45_15.535939
path:
- '**/details_harness|winogrande|5_2023-10-26T10-45-15.535939.parquet'
- split: 2023_10_27T20_12_36.073167
path:
- '**/details_harness|winogrande|5_2023-10-27T20-12-36.073167.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T20-12-36.073167.parquet'
- config_name: results
data_files:
- split: 2023_09_11T18_18_21.353761
path:
- results_2023-09-11T18-18-21.353761.parquet
- split: 2023_09_21T21_35_59.433556
path:
- results_2023-09-21T21-35-59.433556.parquet
- split: 2023_10_26T10_45_15.535939
path:
- results_2023-10-26T10-45-15.535939.parquet
- split: 2023_10_27T20_12_36.073167
path:
- results_2023-10-27T20-12-36.073167.parquet
- split: latest
path:
- results_2023-10-27T20-12-36.073167.parquet
---
# Dataset Card for Evaluation run of StudentLLM/Alpagasus-2-13b-QLoRA-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [StudentLLM/Alpagasus-2-13b-QLoRA-merged](https://huggingface.co/StudentLLM/Alpagasus-2-13b-QLoRA-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-27T20:12:36.073167](https://huggingface.co/datasets/open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13b-QLoRA-merged/blob/main/results_2023-10-27T20-12-36.073167.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241773,
"f1": 0.06265939597315423,
"f1_stderr": 0.001378921060077413,
"acc": 0.4424643648503177,
"acc_stderr": 0.010216085204246378
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241773,
"f1": 0.06265939597315423,
"f1_stderr": 0.001378921060077413
},
"harness|gsm8k|5": {
"acc": 0.11144806671721001,
"acc_stderr": 0.008668021353794427
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698329
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
stas/cm4-synthetic-testing | ---
license: bigscience-openrail-m
---
This dataset is designed to be used in testing multimodal text/image models. It's derived from the cm4-10k dataset.
The current splits are: `['100.unique', '100.repeat', '300.unique', '300.repeat', '1k.unique', '1k.repeat', '10k.unique', '10k.repeat']`.
The `unique` ones ensure uniqueness across text entries.
The `repeat` ones repeat the same 10 unique records; these are useful for debugging memory leaks, since the records are always the same and thus remove record variation from the equation.
The default split is `100.unique`.
The full process of this dataset creation, including which records were used to build it, is documented inside [cm4-synthetic-testing.py](https://huggingface.co/datasets/HuggingFaceM4/cm4-synthetic-testing/blob/main/cm4-synthetic-testing.py)
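The split names above follow a `<size>.<kind>` pattern. A minimal sketch of building and validating one such name (the repository name and split list are taken from this card; the commented-out `load_dataset` call is hypothetical usage that assumes the `datasets` library and Hub access):

```python
# Split names documented in this card: sizes 100/300/1k/10k, each in a
# "unique" and a "repeat" variant.
SPLITS = [f"{size}.{kind}"
          for size in ("100", "300", "1k", "10k")
          for kind in ("unique", "repeat")]

def pick_split(size: str = "100", kind: str = "unique") -> str:
    """Build and validate a split name such as '1k.repeat'."""
    name = f"{size}.{kind}"
    if name not in SPLITS:
        raise ValueError(f"unknown split {name!r}; choose from {SPLITS}")
    return name

# Hypothetical usage (requires network access to the Hub):
# from datasets import load_dataset
# ds = load_dataset("HuggingFaceM4/cm4-synthetic-testing",
#                   split=pick_split("1k", "repeat"))
```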
|
EgilKarlsen/Thunderbird_DistilRoBERTa_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115576729.6875
num_examples: 37500
- name: test
num_bytes: 38525577.5
num_examples: 12500
download_size: 0
dataset_size: 154102307.1875
---
# Dataset Card for "Thunderbird_DistilRoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibragim-bad/arc_easy | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: answerKey
dtype: string
splits:
- name: test
num_bytes: 657514
num_examples: 2376
- name: train
num_bytes: 619000
num_examples: 2251
- name: validation
num_bytes: 157394
num_examples: 570
download_size: 763157
dataset_size: 1433908
---
# Dataset Card for "arc_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/orange_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of orange (Touhou)
This is the dataset of orange (Touhou), containing 48 images and their tags.
The core tags of this character are `long_hair, red_hair, hat, red_eyes, bow, hair_bow, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 48 | 27.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orange_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 48 | 22.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orange_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 76 | 36.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orange_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 48 | 26.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orange_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 76 | 41.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orange_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/orange_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, puffy_short_sleeves, shirt, shoes, vest, white_bow, yellow_headwear, yellow_shorts, full_body, smile, holding, open_mouth, socks, looking_at_viewer, simple_background, white_background, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | puffy_short_sleeves | shirt | shoes | vest | white_bow | yellow_headwear | yellow_shorts | full_body | smile | holding | open_mouth | socks | looking_at_viewer | simple_background | white_background | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------------|:--------|:--------|:-------|:------------|:------------------|:----------------|:------------|:--------|:----------|:-------------|:--------|:--------------------|:--------------------|:-------------------|:-----------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nexdata/Mandarin_Heavy_Accent_Speech_Data | ---
---
# Dataset Card for Nexdata/Mandarin_Heavy_Accent_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/44?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset collects speech from 2,034 speakers native to 26 Chinese provinces, including Henan, Shanxi, Sichuan, Hunan, and Fujian. It contains Mandarin speech with heavy regional accents. The recording content covers finance and economics, entertainment, policy, news, TV, and movies.
For more details, please refer to the link: https://www.nexdata.ai/datasets/44?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Mandarin Chinese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
tglcourse/latent_afhqv2_512px | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
0: cat
1: dog
2: wild
- name: latent
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1052290164
num_examples: 15803
download_size: 1038619876
dataset_size: 1052290164
---
# Dataset Card for "latent_afhqv2_512px"
Each image is cropped to a 512px square and encoded to a 4x64x64 latent representation using the same VAE as that employed by Stable Diffusion.
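The stored latents are already multiplied by Stable Diffusion's latent scaling factor (0.18215), which is why the decoding snippet below divides by it. A minimal sketch of the scaling round trip (the helper names here are illustrative, not part of any library):

```python
import torch

SCALE = 0.18215  # Stable Diffusion's latent scaling factor

def unscale_for_decode(latent: torch.Tensor) -> torch.Tensor:
    # Undo the stored scaling before passing latents to vae.decode
    return latent / SCALE

def scale_after_encode(latent: torch.Tensor) -> torch.Tensor:
    # Apply the scaling to a fresh vae.encode output before storing
    return latent * SCALE

x = torch.randn(1, 4, 64, 64)
assert torch.allclose(scale_after_encode(unscale_for_decode(x)), x)
```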
## Decoding
```python
from diffusers import AutoencoderKL
from datasets import load_dataset
from PIL import Image
import numpy as np
import torch
# load the dataset
dataset = load_dataset('tglcourse/latent_afhqv2_512px')
# Load the VAE (requires access - see repo model card for info)
vae = AutoencoderKL.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="vae")
latent = torch.tensor([dataset['train'][0]['latent']]) # To tensor (bs, 4, 64, 64)
latent = (1 / 0.18215) * latent # Scale to match SD implementation
with torch.no_grad():
image = vae.decode(latent).sample[0] # Decode
image = (image / 2 + 0.5).clamp(0, 1) # To (0, 1)
image = image.detach().cpu().permute(1, 2, 0).numpy() # To numpy, channels last
image = (image * 255).round().astype("uint8") # (0, 255) and type uint8
image = Image.fromarray(image) # To PIL
image # The resulting PIL image
``` |
liuyanchen1015/MULTI_VALUE_mrpc_you_ye | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 2289
num_examples: 9
- name: train
num_bytes: 9366
num_examples: 35
- name: validation
num_bytes: 560
num_examples: 2
download_size: 19112
dataset_size: 12215
---
# Dataset Card for "MULTI_VALUE_mrpc_you_ye"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fujiki/llm-japanese-dataset_wikipedia | ---
license: cc-by-sa-3.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 617413933
num_examples: 1347381
download_size: 335053357
dataset_size: 617413933
---
- This dataset is a subset of [izumi-lab/llm-japanese-dataset](https://huggingface.co/datasets/izumi-lab/llm-japanese-dataset) that includes only the `wikipedia` task.
- Please also refer to the original dataset: [izumi-lab/llm-japanese-dataset](https://huggingface.co/datasets/izumi-lab/llm-japanese-dataset) |
LiamLi1991/law | ---
license: mit
---
|
yangdechuan/demo | ---
license: openrail
language:
- en
task_categories:
- text-generation
pretty_name: tiny_demo
size_categories:
- n<1K
--- |
yangyz1230/promoter_tata_not_filtered | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1913526
num_examples: 5479
- name: test
num_bytes: 213144
num_examples: 612
download_size: 1031281
dataset_size: 2126670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
VegaKH/bpdata | ---
license: afl-3.0
---
|
thelou1s/AudioSet | ---
license: apache-2.0
---
|
pontusnorman123/sroie_dataset | ---
dataset_info:
features:
- name: guid
sequence: string
- name: words
sequence: string
- name: labels
sequence: int64
- name: boxes
sequence:
sequence: int64
- name: actual_bboxes
sequence:
sequence: int64
- name: page_size
sequence:
sequence: int64
- name: images
dtype: image
splits:
- name: train
num_bytes: 27605888.0
num_examples: 32
download_size: 27055763
dataset_size: 27605888.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DarshanaS/IndicAccentDb | ---
license: c-uda
---
## 1. Introduction
Introducing "IndicAccentDB", a novel accent database that satisfies the following requirements:
* **Gender balance:** The database collects a wide range of speakers, balanced between male and female, to capture the characteristics of each speaker's speech.
* **Phonetically balanced, uniform content:** To simplify the classification task and help models distinguish speakers, we built IndicAccentDB with uniform content: a collection of speech recordings of
the Harvard sentences. These sentences combine diverse phonemes with grammatically focused vocabulary and express accents well in sentence-level discourse.
You can access the [Harvard sentences](https://www.cs.columbia.edu/~hgs/audio/harvard.html) recited by the speakers in the recordings; a sample is shown below.
*The juice of lemons makes fine punch.*
*The fish twisted and turned on the bent hook.*
* IndicAccentDB contains speech recordings in six non-native English accents: Gujarati, Hindi, Kannada, Malayalam, Tamil, and Telugu. The accents were collected from volunteers who had strong non-native
English accents and were fluent in at least one Indian language. Each speaker was asked to recite the Harvard sentences, a phonetically balanced
collection of 72 sets of ten sentences each, neither too short nor too long.
## 2. Dataset Usage
To use the dataset in your Python program, refer to the following script:
```python3
from datasets import load_dataset
accent_db = load_dataset("DarshanaS/IndicAccentDb")
```
## 3. Publications
1. [S. Darshana, H. Theivaprakasham, G. Jyothish Lal, B. Premjith, V. Sowmya and K. Soman, "MARS: A Hybrid Deep CNN-based Multi-Accent Recognition System for English Language," 2022 First International Conference on Artificial Intelligence Trends and Pattern Recognition (ICAITPR), Hyderabad, India, 2022, pp. 1-6, doi: 10.1109/ICAITPR51569.2022.9844177.](https://ieeexplore.ieee.org/document/9844177) |
sparklemidi/Prompt-HTML | ---
license: mit
---
|
jjldo21/IndustrialDetectionStaticCameras | ---
license: mit
size_categories:
- 1K<n<10K
task_categories:
- object-detection
tags:
- industry
dataset_info:
features:
- name: image
dtype: image
- name: labels
sequence:
- name: object_type
dtype: string
- name: truncation
dtype: float32
- name: occlusion
dtype: int32
- name: alpha
dtype: int32
- name: left
dtype: float32
- name: top
dtype: float32
- name: right
dtype: float32
- name: bottom
dtype: float32
- name: height
dtype: int32
- name: width
dtype: int32
- name: length
dtype: int32
- name: x
dtype: int32
- name: y
dtype: int32
- name: z
dtype: int32
- name: rotation_y
dtype: int32
splits:
- name: video1
num_bytes: 4497677.132999999
num_examples: 1261
- name: video2
num_bytes: 4116557.136
num_examples: 1221
- name: video3
num_bytes: 4034190.129
num_examples: 1221
- name: video4
num_bytes: 5164007.345000001
num_examples: 1481
- name: video5
num_bytes: 4733783.518
num_examples: 1301
download_size: 19236723
dataset_size: 22546215.261
configs:
- config_name: default
data_files:
- split: video1
path: data/video1-*
- split: video2
path: data/video2-*
- split: video3
path: data/video3-*
- split: video4
path: data/video4-*
- split: video5
path: data/video5-*
---
The **IndustrialDetectionStaticCameras** dataset is divided into five primary folders named `videoY`, where `Y=1,2,3,4,5`. Each `videoY` folder contains the following:
- The video of the scene in `.mp4` format: `videoY.mp4`
- A folder with the images of each frame of the video: `imgs_videoY`
- A folder that includes, for each frame, a `.txt` file holding one annotation line in KITTI format per labelled object: `annotations_videoY`
**Remark:** Each label file contains a set of lines, with each line representing the annotation for a single object in the corresponding image. The format of each line is as follows:
`<object_type> <truncation> <occlusion> <alpha> <left> <top> <right> <bottom> <height> <width> <length> <x> <y> <z> <rotation_y>`,
where only the fields `<object_type>, <left>, <top>, <right>, <bottom>` and `<rotation_y>` are considered. The `<rotation_y>` field is repurposed to indicate whether the labelled object is static in the scene (value `1` means the object is static; value `0` means it is not).
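As an illustrative sketch (the class and helper below are not part of the dataset's tooling), one annotation line can be parsed while keeping only the fields this dataset uses:

```python
from dataclasses import dataclass

@dataclass
class BoxLabel:
    object_type: str
    left: float
    top: float
    right: float
    bottom: float
    is_static: bool  # recovered from the repurposed <rotation_y> field

def parse_kitti_line(line: str) -> BoxLabel:
    # KITTI field order: type, truncation, occlusion, alpha, left, top, right,
    # bottom, height, width, length, x, y, z, rotation_y
    f = line.split()
    return BoxLabel(
        object_type=f[0],
        left=float(f[4]),
        top=float(f[5]),
        right=float(f[6]),
        bottom=float(f[7]),
        is_static=int(float(f[14])) == 1,
    )

label = parse_kitti_line("Car 0.0 0 0 100.0 50.0 200.0 150.0 0 0 0 0 0 0 1")
```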
### Download the dataset:
```python
from datasets import load_dataset
dataset = load_dataset("jjldo21/IndustrialDetectionStaticCameras")
``` |
CyberHarem/saito_ena_yurucamp | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Saitō Ena
This is the dataset of Saitō Ena, containing 268 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 268 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 653 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 756 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 268 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 268 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 268 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 653 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 653 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 562 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 756 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 756 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
sarpba/test | ---
dataset_info:
features:
- name: input_features
sequence:
sequence:
sequence: float32
- name: labels
sequence: int64
- name: input_length
dtype: float64
splits:
- name: train
num_bytes: 18559963744.0
num_examples: 19322
- name: test
num_bytes: 3950822820
num_examples: 4113
download_size: 3796457350
dataset_size: 22510786564.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
girmahahaha/semeval-task-5-english | ---
dataset_info:
features:
- name: labels
sequence: int64
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 2456894
num_examples: 6838
- name: test
num_bytes: 1214918
num_examples: 3259
- name: validation
num_bytes: 338816
num_examples: 886
download_size: 818051
dataset_size: 4010628
---
# Dataset Card for "semeval-task-5-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ToastyPigeon__smolphin-test-stack-sorted | ---
pretty_name: Evaluation run of ToastyPigeon/smolphin-test-stack-sorted
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ToastyPigeon/smolphin-test-stack-sorted](https://huggingface.co/ToastyPigeon/smolphin-test-stack-sorted)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ToastyPigeon__smolphin-test-stack-sorted\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:12:59.929577](https://huggingface.co/datasets/open-llm-leaderboard/details_ToastyPigeon__smolphin-test-stack-sorted/blob/main/results_2024-03-29T21-12-59.929577.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2693886134221748,\n\
\ \"acc_stderr\": 0.03126598314102477,\n \"acc_norm\": 0.2709400437138604,\n\
\ \"acc_norm_stderr\": 0.03203501471283497,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3747768935887674,\n\
\ \"mc2_stderr\": 0.013984674618562865\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2901023890784983,\n \"acc_stderr\": 0.013261573677520764,\n\
\ \"acc_norm\": 0.32337883959044367,\n \"acc_norm_stderr\": 0.01366942163001214\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43905596494722166,\n\
\ \"acc_stderr\": 0.004952576863315216,\n \"acc_norm\": 0.5907189802828122,\n\
\ \"acc_norm_stderr\": 0.0049069629803282985\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677084,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677084\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\
\ \"acc_stderr\": 0.025560604721022902,\n \"acc_norm\": 0.2806451612903226,\n\
\ \"acc_norm_stderr\": 0.025560604721022902\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233485,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713545,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713545\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817258,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817258\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377272,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28073394495412846,\n \"acc_stderr\": 0.019266055045871616,\n \"\
acc_norm\": 0.28073394495412846,\n \"acc_norm_stderr\": 0.019266055045871616\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.22362869198312235,\n \"acc_stderr\": 0.02712329820522997,\n\
\ \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.02712329820522997\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.030069584874494053,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.030069584874494053\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884124,\n \"\
acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884124\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824848,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824848\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888736,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888736\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.022183477668412863,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.022183477668412863\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.01095655665441736,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.01095655665441736\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2867647058823529,\n \"acc_stderr\": 0.027472274473233818,\n\
\ \"acc_norm\": 0.2867647058823529,\n \"acc_norm_stderr\": 0.027472274473233818\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.017160587235046338,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.017160587235046338\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.03694284335337802,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.03694284335337802\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278984,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278984\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2935323383084577,\n\
\ \"acc_stderr\": 0.032200241045342054,\n \"acc_norm\": 0.2935323383084577,\n\
\ \"acc_norm_stderr\": 0.032200241045342054\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3747768935887674,\n\
\ \"mc2_stderr\": 0.013984674618562865\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6124704025256511,\n \"acc_stderr\": 0.013692354636016766\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \
\ \"acc_stderr\": 0.0038289829787357026\n }\n}\n```"
repo_url: https://huggingface.co/ToastyPigeon/smolphin-test-stack-sorted
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-12-59.929577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-12-59.929577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- '**/details_harness|winogrande|5_2024-03-29T21-12-59.929577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-12-59.929577.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_12_59.929577
path:
- results_2024-03-29T21-12-59.929577.parquet
- split: latest
path:
- results_2024-03-29T21-12-59.929577.parquet
---
# Dataset Card for Evaluation run of ToastyPigeon/smolphin-test-stack-sorted
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ToastyPigeon/smolphin-test-stack-sorted](https://huggingface.co/ToastyPigeon/smolphin-test-stack-sorted) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ToastyPigeon__smolphin-test-stack-sorted",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-29T21:12:59.929577](https://huggingface.co/datasets/open-llm-leaderboard/details_ToastyPigeon__smolphin-test-stack-sorted/blob/main/results_2024-03-29T21-12-59.929577.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2693886134221748,
"acc_stderr": 0.03126598314102477,
"acc_norm": 0.2709400437138604,
"acc_norm_stderr": 0.03203501471283497,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3747768935887674,
"mc2_stderr": 0.013984674618562865
},
"harness|arc:challenge|25": {
"acc": 0.2901023890784983,
"acc_stderr": 0.013261573677520764,
"acc_norm": 0.32337883959044367,
"acc_norm_stderr": 0.01366942163001214
},
"harness|hellaswag|10": {
"acc": 0.43905596494722166,
"acc_stderr": 0.004952576863315216,
"acc_norm": 0.5907189802828122,
"acc_norm_stderr": 0.0049069629803282985
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677084,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677084
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.025560604721022902,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.025560604721022902
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233485,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713545,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713545
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817258,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817258
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28073394495412846,
"acc_stderr": 0.019266055045871616,
"acc_norm": 0.28073394495412846,
"acc_norm_stderr": 0.019266055045871616
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22362869198312235,
"acc_stderr": 0.02712329820522997,
"acc_norm": 0.22362869198312235,
"acc_norm_stderr": 0.02712329820522997
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.030069584874494053,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.030069584874494053
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.04205953933884124,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.04205953933884124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824848,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824848
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888736,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888736
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.022183477668412863,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.022183477668412863
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.01095655665441736,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.01095655665441736
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2867647058823529,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.2867647058823529,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.017160587235046338,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.017160587235046338
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.03694284335337802,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.03694284335337802
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278984,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278984
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2935323383084577,
"acc_stderr": 0.032200241045342054,
"acc_norm": 0.2935323383084577,
"acc_norm_stderr": 0.032200241045342054
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3747768935887674,
"mc2_stderr": 0.013984674618562865
},
"harness|winogrande|5": {
"acc": 0.6124704025256511,
"acc_stderr": 0.013692354636016766
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357026
}
}
```
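As a sketch of how these per-task entries can be aggregated locally: the key names and field layout follow the JSON above, but only three tasks are reproduced here for brevity, so the mean below does not match the reported "all" value.

```python
# Minimal sketch: average the MMLU (hendrycksTest) accuracies from a results
# dict shaped like the JSON above. Only three tasks are included for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.25925925925925924},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17105263157894737},
}

# Keep only the hendrycksTest tasks and average their raw accuracies.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
```

The same filtering pattern applies to the full results file loaded via the "results" config shown earlier.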
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b | ---
pretty_name: Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AiMavenAi/AiMaven-SmartDawg-7b](https://huggingface.co/AiMavenAi/AiMaven-SmartDawg-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T18:46:13.340145](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b/blob/main/results_2024-01-16T18-46-13.340145.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6302261241898747,\n\
\ \"acc_stderr\": 0.03275821367319639,\n \"acc_norm\": 0.6319182401020044,\n\
\ \"acc_norm_stderr\": 0.033417578490707575,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.01727001528447685,\n \"mc2\": 0.5886125043102783,\n\
\ \"mc2_stderr\": 0.015752842438606557\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726094,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946531\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6996614220274846,\n\
\ \"acc_stderr\": 0.004574683373821048,\n \"acc_norm\": 0.8716391157140012,\n\
\ \"acc_norm_stderr\": 0.0033380760156172602\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n\
\ \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n\
\ \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n\
\ \"acc_stderr\": 0.03261936918467381,\n \"acc_norm\": 0.5319148936170213,\n\
\ \"acc_norm_stderr\": 0.03261936918467381\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"\
acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335068,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335068\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.014419123980931895,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.014419123980931895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\
\ \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n\
\ \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399677,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399677\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717213,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717213\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.01727001528447685,\n \"mc2\": 0.5886125043102783,\n\
\ \"mc2_stderr\": 0.015752842438606557\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.011446280629262631\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5724033358605004,\n \
\ \"acc_stderr\": 0.013627322286986808\n }\n}\n```"
repo_url: https://huggingface.co/AiMavenAi/AiMaven-SmartDawg-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|arc:challenge|25_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|gsm8k|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hellaswag|10_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T18-46-13.340145.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- '**/details_harness|winogrande|5_2024-01-16T18-46-13.340145.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T18-46-13.340145.parquet'
- config_name: results
data_files:
- split: 2024_01_16T18_46_13.340145
path:
- results_2024-01-16T18-46-13.340145.parquet
- split: latest
path:
- results_2024-01-16T18-46-13.340145.parquet
---
# Dataset Card for Evaluation run of AiMavenAi/AiMaven-SmartDawg-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AiMavenAi/AiMaven-SmartDawg-7b](https://huggingface.co/AiMavenAi/AiMaven-SmartDawg-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-16T18:46:13.340145](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__AiMaven-SmartDawg-7b/blob/main/results_2024-01-16T18-46-13.340145.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6302261241898747,
"acc_stderr": 0.03275821367319639,
"acc_norm": 0.6319182401020044,
"acc_norm_stderr": 0.033417578490707575,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447685,
"mc2": 0.5886125043102783,
"mc2_stderr": 0.015752842438606557
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726094,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946531
},
"harness|hellaswag|10": {
"acc": 0.6996614220274846,
"acc_stderr": 0.004574683373821048,
"acc_norm": 0.8716391157140012,
"acc_norm_stderr": 0.0033380760156172602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335068,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335068
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.014419123980931895,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.014419123980931895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399677,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399677
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717213,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717213
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447685,
"mc2": 0.5886125043102783,
"mc2_stderr": 0.015752842438606557
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.011446280629262631
},
"harness|gsm8k|5": {
"acc": 0.5724033358605004,
"acc_stderr": 0.013627322286986808
}
}
```
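As a sketch of working with this layout: each top-level key in the JSON above is a task name of the form `harness|<task>|<shots>` mapping to a metrics dict. A couple of the values shown are embedded below purely for illustration:

```python
# Minimal sketch: extracting per-task accuracy from a results dict shaped
# like the JSON above (only a small excerpt is embedded here for illustration).
results = {
    "harness|winogrande|5": {"acc": 0.7900552486187845,
                             "acc_stderr": 0.011446280629262631},
    "harness|gsm8k|5": {"acc": 0.5724033358605004,
                        "acc_stderr": 0.013627322286986808},
}

# Collect {task: acc} pairs, stripping the "harness|" prefix and shot count.
accs = {name.split("|")[1]: metrics["acc"] for name, metrics in results.items()}
print(accs["winogrande"])
```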
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lprat/wikidata_medium | ---
dataset_info:
features:
- name: texts
dtype: string
- name: questions
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 174729655
num_examples: 40073
download_size: 17928919
dataset_size: 174729655
---
# Dataset Card for "wikidata_medium"
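The `answers` feature above is a struct with parallel `answer_start` and `text` sequences, i.e. the SQuAD-style extractive-QA convention. A minimal sketch with a hypothetical record (not taken from the dataset) showing how the span indexing lines up:

```python
# Hypothetical record illustrating the schema declared in the frontmatter:
# texts (context), questions, and a SQuAD-style answers struct.
record = {
    "texts": "Paris is the capital of France.",
    "questions": "What is the capital of France?",
    "answers": {"answer_start": [0], "text": ["Paris"]},
}

# Recover the answer span from the context using answer_start.
start = record["answers"]["answer_start"][0]
answer = record["answers"]["text"][0]
assert record["texts"][start:start + len(answer)] == answer
```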
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adambuttrick/flair-ner-funding-statements-10K | ---
license: cc
---
|
Elliot4AI/dolly-15k-chinese-guanacoformat | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
language:
- zh
tags:
- finance
size_categories:
- 10K<n<100K
---
# Dataset Summary
## 🏡🏡🏡🏡 Fine-tune Dataset: Chinese Dataset 🏡🏡🏡🏡
😀😀😀😀😀😀😀😀 This dataset is the Chinese guanaco-format version of databricks/databricks-dolly-15k
|
davidfant/wikipedia-simple | ---
dataset_info:
features:
- name: id
dtype: int64
- name: slug
dtype: string
- name: title
dtype: string
- name: revision_id
dtype: int64
- name: markdown
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2150458756
num_examples: 338228
download_size: 1045572646
dataset_size: 2150458756
---
# Dataset Card for "wikipedia-simple"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_195 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1027482852
num_examples: 200211
download_size: 1042897184
dataset_size: 1027482852
---
# Dataset Card for "chunk_195"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mdass/gpt_gen_desc_logos | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2006069.0
num_examples: 100
download_size: 1991578
dataset_size: 2006069.0
---
# Dataset Card for "gpt_gen_desc_logos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_7 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1147966816.0
num_examples: 223688
download_size: 1172420614
dataset_size: 1147966816.0
---
# Dataset Card for "chunk_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-71000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 995654
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ru_class_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ru_class/戦艦ル級 (Kantai Collection)
This is the dataset of ru_class/戦艦ル級 (Kantai Collection), containing 44 images and their tags.
The core tags of this character are `long_hair, black_hair, pale_skin, glowing_eyes, blue_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 30.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ru_class_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 25.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ru_class_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 70 | 38.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ru_class_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 29.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ru_class_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 70 | 43.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ru_class_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ru_class_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, abyssal_ship, solo, glowing, turret, cannon, yellow_eyes |
| 1 | 5 |  |  |  |  |  | 1girl, abyssal_ship, black_pants, solo, looking_at_viewer, turret, aqua_eyes, boots, cannon, full_body, machinery, white_background, glowing, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | abyssal_ship | solo | glowing | turret | cannon | yellow_eyes | black_pants | looking_at_viewer | aqua_eyes | boots | full_body | machinery | white_background | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:----------|:---------|:---------|:--------------|:--------------|:--------------------|:------------|:--------|:------------|:------------|:-------------------|:--------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | X | X |
|
OEvortex/Vortex-50k | ---
language:
- en
license: other
license_name: hsul
license_link: https://huggingface.co/OEvortex/vortex-3b/raw/main/LICENSE.md
---
|
KK1mo/tedigan_mask | ---
dataset_info:
features:
- name: id
dtype: string
- name: caption
dtype: string
- name: mask
dtype: image
- name: non_edited_image
dtype: image
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 1953997426.2739997
num_examples: 1498
download_size: 1954051980
dataset_size: 1953997426.2739997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/thany_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of thany (Fire Emblem)
This is the dataset of thany (Fire Emblem), containing 102 images and their tags.
The core tags of this character are `blue_hair, short_hair, blue_eyes, headband, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 102 | 80.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thany_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 102 | 57.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thany_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 187 | 101.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thany_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 102 | 74.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thany_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 187 | 123.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thany_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/thany_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, thighhighs, breastplate, pegasus_knight_uniform_(fire_emblem), skirt, solo, belt, fingerless_gloves, thigh_boots, zettai_ryouiki, open_mouth, spear, full_body, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, hair_ornament, solo, strapless_dress, white_dress, bangs, detached_collar, feather_trim, flower, full_body, holding_bow_(weapon), medium_breasts, open_mouth, shiny_hair, thigh_boots, thighhighs, wedding_dress, white_footwear, high_heel_boots, layered_skirt, smile, bare_shoulders, circlet, earrings, looking_at_viewer, looking_away, simple_background, white_background |
| 2 | 28 |  |  |  |  |  | 1girl, blush, open_mouth, hetero, solo_focus, nipples, sex, penis, 1boy, nude, medium_breasts, mosaic_censoring, vaginal, cum, pussy, sweat, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | thighhighs | breastplate | pegasus_knight_uniform_(fire_emblem) | skirt | solo | belt | fingerless_gloves | thigh_boots | zettai_ryouiki | open_mouth | spear | full_body | white_background | hair_ornament | strapless_dress | white_dress | bangs | detached_collar | feather_trim | flower | holding_bow_(weapon) | medium_breasts | shiny_hair | wedding_dress | white_footwear | high_heel_boots | layered_skirt | smile | bare_shoulders | circlet | earrings | looking_at_viewer | looking_away | simple_background | blush | hetero | solo_focus | nipples | sex | penis | 1boy | nude | mosaic_censoring | vaginal | cum | pussy | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------|:---------------------------------------|:--------|:-------|:-------|:--------------------|:--------------|:-----------------|:-------------|:--------|:------------|:-------------------|:----------------|:------------------|:--------------|:--------|:------------------|:---------------|:---------|:-----------------------|:-----------------|:-------------|:----------------|:-----------------|:------------------|:----------------|:--------|:-----------------|:----------|:-----------|:--------------------|:---------------|:--------------------|:--------|:---------|:-------------|:----------|:------|:--------|:-------|:-------|:-------------------|:----------|:------|:--------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 2 | 28 |  |  |  |  |  | X | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TalTechNLP/err-newsroom | ---
task_categories:
- summarization
- text2text-generation
language:
- et
---
## ERR Newsroom
This dataset contains news articles from the website of Estonian Public Broadcasting (https://err.ee), covering the period 2016–2022.
Each news article has three text fields: heading, lead-in and text.
The dataset is divided into train, dev and test sets. The dev set contains articles from November 2022 and the test set articles from December 2022; all remaining articles are in the train set.
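The date-based split described above can be sketched as a small helper. This is only an illustration of the partitioning rule, not part of the dataset's tooling; the `assign_split` function and the sample records are hypothetical.

```python
from datetime import date

# Split rule described in the card:
# dev = articles from November 2022, test = December 2022, train = everything else.
def assign_split(published: date) -> str:
    if published.year == 2022 and published.month == 11:
        return "dev"
    if published.year == 2022 and published.month == 12:
        return "test"
    return "train"

# Hypothetical sample records with the card's three text fields plus a date.
articles = [
    {"heading": "A", "lead_in": "...", "text": "...", "published": date(2019, 5, 3)},
    {"heading": "B", "lead_in": "...", "text": "...", "published": date(2022, 11, 14)},
    {"heading": "C", "lead_in": "...", "text": "...", "published": date(2022, 12, 24)},
]
splits = {a["heading"]: assign_split(a["published"]) for a in articles}
```

Here `splits` maps each article to its split (`A` → train, `B` → dev, `C` → test), matching the month-based scheme.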
|
benayas/massive_artificial_5pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 802767
num_examples: 11514
download_size: 250983
dataset_size: 802767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-xsum-19ae30f1-12355647 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP10
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP10
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |