| datasetId | card |
|---|---|
mfi/lotr-book | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2196528.0
num_examples: 268
- name: test
num_bytes: 245880.0
num_examples: 30
download_size: 1125559
dataset_size: 2442408.0
---
# Dataset Card for "lotr-book"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hobson/surname-nationality | ---
license: mit
size_categories: List[str]
source_datasets: List[str]
task_categories:
- token-classification
- text-classification
task_ids:
- named-entity-recognition
pretty_name: Popular Surname Nationality Mapping
tags:
- multilingual
- RNN
- name
- tagging
- nlp
- transliterated
- character-level
- text-tagging
- bias
- classification
- language model
- surname
- ethnicity
- multilabel classification
- natural language
---
# Popular Surname Nationality Mapping
A sample of popular surnames from 30+ countries, each labeled with nationality (language)
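
The tags above mention character-level modeling with an RNN. As an illustrative sketch (not code from this dataset; the alphabet and the helper name are assumptions), a transliterated surname can be mapped to per-character indices before being fed to such a model:

```python
def encode_surname(name: str, alphabet: str = "abcdefghijklmnopqrstuvwxyz") -> list:
    """Map a transliterated surname to per-character indices (-1 for unknown chars)."""
    return [alphabet.find(ch) for ch in name.lower()]

# 'n' -> 13, 'g' -> 6, 'u' -> 20, 'y' -> 24, 'e' -> 4, 'n' -> 13
print(encode_surname("Nguyen"))  # [13, 6, 20, 24, 4, 13]
```

A real pipeline would likely reserve a dedicated index for out-of-alphabet characters rather than -1, and pad names to a fixed length for batching.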
|
TiagoB23/ExperimentalFourthBrainMailingDS | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 18730
num_examples: 10
download_size: 23195
dataset_size: 18730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ExperimentalFourthBrainMailingDS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
speed1/menok | ---
license: openrail
---
|
zolak/twitter_dataset_78_1713102950 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2966436
num_examples: 7242
download_size: 1497620
dataset_size: 2966436
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BohdanPetryshyn/openapi-completion | ---
dataset_info:
features:
- name: file
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 730407234.0961818
num_examples: 4030
download_size: 131492740
dataset_size: 730407234.0961818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BenjaminSombi/jobevaluation | ---
license: apache-2.0
---
|
Deivid457/Thiago-Aquino | ---
license: openrail
---
|
datajuicer/redpajama-cc-2019-30-refined-by-data-juicer | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- data-juicer
- pretraining
size_categories:
- 10M<n<100M
---
# RedPajama -- CommonCrawl-2019-30 (refined by Data-Juicer)
A refined version of the CommonCrawl-2019-30 dataset from RedPajama, produced by [Data-Juicer](https://github.com/alibaba/data-juicer). "Bad" samples were removed from the original dataset to make it higher quality.
This dataset is typically used to pretrain a Large Language Model.
**Notice**: Here is a small subset for previewing. The whole dataset is available [here](https://dail-wlcb.oss-cn-wulanchabu.aliyuncs.com/LLM_data/our_refined_datasets/pretraining/redpajama-cc-refine-results/redpajama-cc-2019-30-refine-result.jsonl) (About 240GB).
## Dataset Information
- Number of samples: 36,557,283 (Keep ~45.08% from the original dataset)
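Many thresholds in the refining recipe are annotated as `3sigma`: they are conventionally derived from corpus-level statistics as the mean plus or minus three standard deviations of each filter's stat. A minimal sketch of that convention (the sample values below are invented for illustration, not the actual corpus stats):

```python
import statistics

def three_sigma_bounds(values):
    """Return (mean - 3*sigma, mean + 3*sigma) clipping bounds for a filter stat."""
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values)
    return mu - 3 * sigma, mu + 3 * sigma

# toy alphanumeric-ratio samples; a real recipe computes this over the whole corpus
lo, hi = three_sigma_bounds([0.78, 0.80, 0.82, 0.79, 0.81])
```

Documents whose stat falls outside `[lo, hi]` are then dropped by the corresponding filter operator.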
## Refining Recipe
```yaml
# global parameters
project_name: 'Data-Juicer-recipes-cc-2019-30'
dataset_path: '/path/to/your/dataset' # path to your dataset directory or file
export_path: '/path/to/your/dataset.jsonl'
np: 50 # number of subprocesses used to process your dataset
open_tracer: true
# process schedule
# a list of several process operators with their arguments
process:
- document_simhash_deduplicator:
tokenization: space
window_size: 6
lowercase: true
ignore_pattern: '\p{P}'
num_blocks: 6
hamming_distance: 4
- clean_email_mapper:
- clean_links_mapper:
- fix_unicode_mapper:
- punctuation_normalization_mapper:
- whitespace_normalization_mapper:
- alphanumeric_filter: # 770218
tokenization: false
min_ratio: 0.7489 # 3sigma
max_ratio: 0.8585 # 3sigma
- average_line_length_filter: # for code
max_len: 1500 # < 3sigma (2689) -- 177520
- character_repetition_filter:
rep_len: 10
max_ratio: 0.3 # > 3sigma (0.1491) -- 151703
- flagged_words_filter:
lang: en
tokenization: true
max_ratio: 0.0025 # 3sigma -- 101540
- language_id_score_filter: # remove language filter
min_score: 0.788 # 3sigma -- 1622574
- maximum_line_length_filter: # for code
max_len: 5000 # < 3sigma (8775) -- 485806
- perplexity_filter:
lang: en
max_ppl: 5000 # < 3sigma (6723) -- 676914
- special_characters_filter:
min_ratio: 0.15 # > 3sigma (0.104)
max_ratio: 0.35 # > 3sigma (0.322) -- 859797
- text_length_filter:
max_len: 65589 # 3sigma -- 975142
- words_num_filter:
lang: en
tokenization: true
min_num: 20 # > 3sigma -- 196
max_num: 13030 # 3sigma -- 989078
- word_repetition_filter:
lang: en
tokenization: true
rep_len: 10
max_ratio: 0.279 # 3sigma -- 1716308
``` |
liuyanchen1015/MULTI_VALUE_stsb_zero_plural_after_quantifier | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 39852
num_examples: 234
- name: test
num_bytes: 36189
num_examples: 235
- name: train
num_bytes: 133855
num_examples: 810
download_size: 133672
dataset_size: 209896
---
# Dataset Card for "MULTI_VALUE_stsb_zero_plural_after_quantifier"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lawinsider/uk_ner_contracts | ---
task_categories:
- token-classification
task_ids:
- named-entity-recognition
language:
- uk
pretty_name: UK-NER-contracts
---
### Dataset Description
Legal Contracts Dataset for Training NER Model

This repository contains a specially curated dataset of legal contracts, designed for training a Named Entity Recognition (NER) model that recognizes and classifies four types of entities in the text:
- Contract Type
- Clause Title
- Clause Number
- Definition Title

The dataset includes a broad variety of legal contracts, covering diverse domains such as employment, real estate, services, sale, and lease.
Entities in the text have been manually labeled by experts in the field, ensuring high-quality training data for the model.
Each document in the dataset has been annotated in the following format:
(Start_Position, End_Position, Entity_Label)
For example, a clause title may be annotated as follows: (102, 115, 'clause title')
This will assist the NER model in identifying not only the text of the entity, but also its position within the document.
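As a brief sketch of how such annotations are consumed (the contract snippet and offsets below are invented for illustration, not taken from the dataset), character-offset triples can be resolved back to entity surface text:

```python
def extract_entities(text, annotations):
    """Resolve (start, end, label) character-offset annotations to (surface, label) pairs."""
    return [(text[start:end], label) for start, end, label in annotations]

# hypothetical contract snippet and offsets
text = "1. Confidentiality. The Receiving Party shall keep all data secret."
annotations = [(0, 1, "clause number"), (3, 18, "clause title")]
print(extract_entities(text, annotations))
# [('1', 'clause number'), ('Confidentiality', 'clause title')]
```

Verifying that each recovered surface string matches the expected entity text is a useful sanity check before training.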
### Usage Guidelines
|
zhixiaoni/CROHME_selected_Train_2014_png | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 15733272.15
num_examples: 5618
download_size: 14207546
dataset_size: 15733272.15
---
# Dataset Card for "CROHME_selected_Train_2014_png"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
varun-d/demo-data | ---
license: apache-2.0
---
|
Tvsybkzkmapab/Amharic_ad_generation | ---
language:
- am
license: apache-2.0
---
|
open-llm-leaderboard/details_Sharathhebbar24__code_gpt2_mini_model | ---
pretty_name: Evaluation run of Sharathhebbar24/code_gpt2_mini_model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sharathhebbar24/code_gpt2_mini_model](https://huggingface.co/Sharathhebbar24/code_gpt2_mini_model)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sharathhebbar24__code_gpt2_mini_model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T15:41:13.540952](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__code_gpt2_mini_model/blob/main/results_2024-02-02T15-41-13.540952.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24939264685687307,\n\
\ \"acc_stderr\": 0.030508768932231183,\n \"acc_norm\": 0.25044580537440064,\n\
\ \"acc_norm_stderr\": 0.03132454191182828,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.39863932434367527,\n\
\ \"mc2_stderr\": 0.01509297997669473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18600682593856654,\n \"acc_stderr\": 0.01137094018326675,\n\
\ \"acc_norm\": 0.23720136518771331,\n \"acc_norm_stderr\": 0.01243039982926085\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28888667596096396,\n\
\ \"acc_stderr\": 0.004523188431142895,\n \"acc_norm\": 0.31248755228042224,\n\
\ \"acc_norm_stderr\": 0.0046256009167749855\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342343,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342343\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n\
\ \"acc_stderr\": 0.02402225613030824,\n \"acc_norm\": 0.23225806451612904,\n\
\ \"acc_norm_stderr\": 0.02402225613030824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.033711241426263014,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.033711241426263014\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.030748905363909902,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.030748905363909902\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959316,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959316\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882364,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3229357798165138,\n \"acc_stderr\": 0.02004811592341533,\n \"\
acc_norm\": 0.3229357798165138,\n \"acc_norm_stderr\": 0.02004811592341533\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.028963702570791037,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.028963702570791037\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159253,\n \
\ \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044793,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044793\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.02699219917306436,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.02699219917306436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034503,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.036942843353378,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.036942843353378\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23265306122448978,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.23265306122448978,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18674698795180722,\n\
\ \"acc_stderr\": 0.030338749144500597,\n \"acc_norm\": 0.18674698795180722,\n\
\ \"acc_norm_stderr\": 0.030338749144500597\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.39863932434367527,\n\
\ \"mc2_stderr\": 0.01509297997669473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859332\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Sharathhebbar24/code_gpt2_mini_model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|arc:challenge|25_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|gsm8k|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hellaswag|10_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T15-41-13.540952.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T15-41-13.540952.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- '**/details_harness|winogrande|5_2024-02-02T15-41-13.540952.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T15-41-13.540952.parquet'
- config_name: results
data_files:
- split: 2024_02_02T15_41_13.540952
path:
- results_2024-02-02T15-41-13.540952.parquet
- split: latest
path:
- results_2024-02-02T15-41-13.540952.parquet
---
# Dataset Card for Evaluation run of Sharathhebbar24/code_gpt2_mini_model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sharathhebbar24/code_gpt2_mini_model](https://huggingface.co/Sharathhebbar24/code_gpt2_mini_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__code_gpt2_mini_model",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-02T15:41:13.540952](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__code_gpt2_mini_model/blob/main/results_2024-02-02T15-41-13.540952.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its timestamped split and in the "latest" split):
```json
{
"all": {
"acc": 0.24939264685687307,
"acc_stderr": 0.030508768932231183,
"acc_norm": 0.25044580537440064,
"acc_norm_stderr": 0.03132454191182828,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.39863932434367527,
"mc2_stderr": 0.01509297997669473
},
"harness|arc:challenge|25": {
"acc": 0.18600682593856654,
"acc_stderr": 0.01137094018326675,
"acc_norm": 0.23720136518771331,
"acc_norm_stderr": 0.01243039982926085
},
"harness|hellaswag|10": {
"acc": 0.28888667596096396,
"acc_stderr": 0.004523188431142895,
"acc_norm": 0.31248755228042224,
"acc_norm_stderr": 0.0046256009167749855
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342343,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342343
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.02402225613030824,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.02402225613030824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.033711241426263014,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.033711241426263014
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.030748905363909902,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.030748905363909902
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959316,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959316
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882364,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3229357798165138,
"acc_stderr": 0.02004811592341533,
"acc_norm": 0.3229357798165138,
"acc_norm_stderr": 0.02004811592341533
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.028963702570791037,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.028963702570791037
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044793,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044793
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.02699219917306436,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.02699219917306436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034503,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18674698795180722,
"acc_stderr": 0.030338749144500597,
"acc_norm": 0.18674698795180722,
"acc_norm_stderr": 0.030338749144500597
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.39863932434367527,
"mc2_stderr": 0.01509297997669473
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859332
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
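The per-task entries above all share one shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so aggregating them is a short exercise. A minimal sketch, using a hypothetical two-task excerpt with values copied from the block above (real results files carry one entry per evaluated task):

```python
# Aggregate per-task accuracies from an Open LLM Leaderboard results dict.
# The `results` dict below is a hypothetical excerpt shaped like the JSON
# block above, not the full results file.
from statistics import mean

results = {
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.2537313432835821, "acc_stderr": 0.03076944496729601,
        "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.03076944496729601,
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.25, "acc_stderr": 0.04351941398892446,
        "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446,
    },
}

# Keep only entries that report a plain accuracy; the truthfulqa entry
# reports mc1/mc2 instead and would be skipped by this filter.
accs = [v["acc"] for v in results.values() if "acc" in v]
avg_acc = mean(accs)
print(f"mean acc over {len(accs)} tasks: {avg_acc:.4f}")
```

The same filter-then-average pattern applies to `acc_norm` or any other shared key.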
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Rewcifer/clean_trainset_2000_cutoff_llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 394450486.10182
num_examples: 100767
download_size: 90442844
dataset_size: 394450486.10182
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "clean_trainset_2000_cutoff_llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fia24/filtered_lemma41kV0.0.2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Inflected_Word
dtype: string
- name: Lemma
dtype: string
splits:
- name: train
num_bytes: 1794357.4941723635
num_examples: 28553
- name: test
num_bytes: 224349.67443684858
num_examples: 3570
- name: val
num_bytes: 224286.83139078785
num_examples: 3569
download_size: 1201505
dataset_size: 2242994.0
---
# Dataset Card for "filtered_lemma41kV0.0.2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mikhail-panzo/ceb-fleur | ---
dataset_info:
features:
- name: speaker_embeddings
sequence: float32
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
splits:
- name: train
num_bytes: 580446080
num_examples: 2147
download_size: 574060157
dataset_size: 580446080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_50_1713094189 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3203535
num_examples: 7960
download_size: 1585626
dataset_size: 3203535
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mii-community/liberliber-cleaned | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1431377059
num_examples: 4510
download_size: 880723375
dataset_size: 1431377059
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SUSTech/mt_bench_ppl_large | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: category
dtype: string
- name: turn
list:
- name: content
dtype: string
- name: role
dtype: string
- name: reference
sequence: string
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
- name: finished
dtype: bool
- name: score
dtype: float64
splits:
- name: train
num_bytes: 226809
num_examples: 80
download_size: 106782
dataset_size: 226809
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
learningbot/hadoop | ---
license: gpl-3.0
---
|
AlekseyKorshuk/chai-synthetic-pairwise | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1690616961
num_examples: 41128
- name: test
num_bytes: 47839521
num_examples: 4570
download_size: 781208088
dataset_size: 1738456482
---
# Dataset Card for "chai-synthetic-pairwise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yatsby/persona_chat | ---
language:
- ko
task_categories:
- conversational
dataset_info:
features:
- name: persona
struct:
- name: 나이
dtype: string
- name: 비밀
dtype: string
- name: 성격
dtype: string
- name: 외모
dtype: string
- name: 이름
dtype: string
- name: 이상
dtype: string
- name: 직업
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 47910381
num_examples: 21973
- name: valid
num_bytes: 2519850
num_examples: 1160
download_size: 25171790
dataset_size: 50430231
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
A dataset of personas, questions, and answers generated with Gemini. |
cellfabrik/algae | ---
license: apache-2.0
---
|
jamesagilesoda/dummy-text-1k | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: lang
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1671083.6363636365
num_examples: 1000
- name: test
num_bytes: 167108.36363636365
num_examples: 100
download_size: 1071562
dataset_size: 1838192.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
gg-ai/dataset-072623 | ---
dataset_info:
features:
- name: text
dtype: string
- name: sent
dtype: int64
- name: text_0
dtype: string
- name: text_1
dtype: string
- name: text_2
dtype: string
- name: text_3
dtype: string
splits:
- name: train
num_bytes: 2194163
num_examples: 3000
- name: test
num_bytes: 331198
num_examples: 450
download_size: 1603495
dataset_size: 2525361
---
# Dataset Card for "dataset-072623"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MBJC/diffsinger_keqing | ---
license: mit
---
|
zelalt/MLPapers-Arxiv | ---
dataset_info:
features:
- name: title
dtype: string
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 145682026
num_examples: 117592
download_size: 83722678
dataset_size: 145682026
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MLPapers-Arxiv"
Original Dataset: [CShorten/ML-ArXiv-Papers](https://huggingface.co/datasets/CShorten/ML-ArXiv-Papers)
|
rajteer/Natural_disaster_tweets | ---
license: mit
---
|
cmglmsr/Impartial-GenAI-Dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 31254
num_examples: 3
download_size: 32194
dataset_size: 31254
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Impartial-GenAI-Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_argilla__notus-8x7b-experiment | ---
pretty_name: Evaluation run of argilla/notus-8x7b-experiment
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [argilla/notus-8x7b-experiment](https://huggingface.co/argilla/notus-8x7b-experiment)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__notus-8x7b-experiment\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-24T21:16:18.856195](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-8x7b-experiment/blob/main/results_2023-12-24T21-16-18.856195.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7124178625456173,\n\
\ \"acc_stderr\": 0.030199955715548343,\n \"acc_norm\": 0.7160635607907738,\n\
\ \"acc_norm_stderr\": 0.0307822236181654,\n \"mc1\": 0.5079559363525091,\n\
\ \"mc1_stderr\": 0.01750128507455182,\n \"mc2\": 0.6579117349463197,\n\
\ \"mc2_stderr\": 0.015011154188590699\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518822,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520762\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.688707428799044,\n\
\ \"acc_stderr\": 0.004620758579628659,\n \"acc_norm\": 0.8773152758414658,\n\
\ \"acc_norm_stderr\": 0.0032740447231806207\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562429,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562429\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.676595744680851,\n \"acc_stderr\": 0.030579442773610334,\n \"\
acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8516129032258064,\n\
\ \"acc_stderr\": 0.020222737554330378,\n \"acc_norm\": 0.8516129032258064,\n\
\ \"acc_norm_stderr\": 0.020222737554330378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6157635467980296,\n \"acc_stderr\": 0.03422398565657551,\n\
\ \"acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.03422398565657551\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343336,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343336\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295159,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295159\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8067226890756303,\n \"acc_stderr\": 0.025649470265889183,\n\
\ \"acc_norm\": 0.8067226890756303,\n \"acc_norm_stderr\": 0.025649470265889183\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"\
acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568617,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595694,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8773946360153256,\n\
\ \"acc_stderr\": 0.011728672144131563,\n \"acc_norm\": 0.8773946360153256,\n\
\ \"acc_norm_stderr\": 0.011728672144131563\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46033519553072627,\n\
\ \"acc_stderr\": 0.016669799592112032,\n \"acc_norm\": 0.46033519553072627,\n\
\ \"acc_norm_stderr\": 0.016669799592112032\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108395,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108395\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.545632333767927,\n\
\ \"acc_stderr\": 0.01271694172073482,\n \"acc_norm\": 0.545632333767927,\n\
\ \"acc_norm_stderr\": 0.01271694172073482\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294254,\n\
\ \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294254\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803404,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.026711430555538405,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.026711430555538405\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5079559363525091,\n\
\ \"mc1_stderr\": 0.01750128507455182,\n \"mc2\": 0.6579117349463197,\n\
\ \"mc2_stderr\": 0.015011154188590699\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305889\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6163760424564063,\n \
\ \"acc_stderr\": 0.01339423858493816\n }\n}\n```"
repo_url: https://huggingface.co/argilla/notus-8x7b-experiment
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|arc:challenge|25_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|gsm8k|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hellaswag|10_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T21-16-18.856195.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T21-16-18.856195.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- '**/details_harness|winogrande|5_2023-12-24T21-16-18.856195.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-24T21-16-18.856195.parquet'
- config_name: results
data_files:
- split: 2023_12_24T21_16_18.856195
path:
- results_2023-12-24T21-16-18.856195.parquet
- split: latest
path:
- results_2023-12-24T21-16-18.856195.parquet
---
# Dataset Card for Evaluation run of argilla/notus-8x7b-experiment
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [argilla/notus-8x7b-experiment](https://huggingface.co/argilla/notus-8x7b-experiment) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__notus-8x7b-experiment",
"harness_winogrande_5",
	split="latest")
```
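The timestamped split names above encode the run timestamp with underscores in place of `-` and `:`. If you need to compare or sort runs programmatically, a small helper like the following can map a split name back to a `datetime` (an illustrative sketch, not part of the `datasets` library):

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_12_24T21_16_18.856195": the date part uses
    # underscores instead of "-", and the time part uses underscores instead of ":".
    date_part, time_part = split_name.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

run_time = split_name_to_datetime("2023_12_24T21_16_18.856195")
```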
## Latest results
These are the [latest results from run 2023-12-24T21:16:18.856195](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-8x7b-experiment/blob/main/results_2023-12-24T21-16-18.856195.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7124178625456173,
"acc_stderr": 0.030199955715548343,
"acc_norm": 0.7160635607907738,
"acc_norm_stderr": 0.0307822236181654,
"mc1": 0.5079559363525091,
"mc1_stderr": 0.01750128507455182,
"mc2": 0.6579117349463197,
"mc2_stderr": 0.015011154188590699
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518822,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520762
},
"harness|hellaswag|10": {
"acc": 0.688707428799044,
"acc_stderr": 0.004620758579628659,
"acc_norm": 0.8773152758414658,
"acc_norm_stderr": 0.0032740447231806207
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330378,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.03422398565657551,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.03422398565657551
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343336,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295159,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295159
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8067226890756303,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.8067226890756303,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8788990825688073,
"acc_stderr": 0.013987618292389713,
"acc_norm": 0.8788990825688073,
"acc_norm_stderr": 0.013987618292389713
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568617,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595694,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8773946360153256,
"acc_stderr": 0.011728672144131563,
"acc_norm": 0.8773946360153256,
"acc_norm_stderr": 0.011728672144131563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46033519553072627,
"acc_stderr": 0.016669799592112032,
"acc_norm": 0.46033519553072627,
"acc_norm_stderr": 0.016669799592112032
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108395,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108395
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060006,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.545632333767927,
"acc_stderr": 0.01271694172073482,
"acc_norm": 0.545632333767927,
"acc_norm_stderr": 0.01271694172073482
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294254,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294254
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803404,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.026711430555538405,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.026711430555538405
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5079559363525091,
"mc1_stderr": 0.01750128507455182,
"mc2": 0.6579117349463197,
"mc2_stderr": 0.015011154188590699
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305889
},
"harness|gsm8k|5": {
"acc": 0.6163760424564063,
"acc_stderr": 0.01339423858493816
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
omar47/dummy_en-asr | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
struct:
- name: bytes
dtype: binary
- name: path
dtype: string
- name: text
dtype: string
splits:
- name: test
num_bytes: 5333955
num_examples: 40
- name: validation
num_bytes: 3749784
num_examples: 40
- name: train
num_bytes: 13316202
num_examples: 60
download_size: 21482093
dataset_size: 22399941
---
# Dataset Card for "dummy_en-asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
2-jp/teste | ---
license: openrail
---
|
ashwathjadhav23/Spanish_MLM_3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3451474
num_examples: 25000
download_size: 1919406
dataset_size: 3451474
---
# Dataset Card for "Spanish_MLM_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cognitive-Lab/Aya_Hindi | ---
dataset_info:
- config_name: complete_dataset
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 5634135057
num_examples: 3771709
download_size: 1626230714
dataset_size: 5634135057
- config_name: templated_hindi_headline
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 915323132
num_examples: 94217
download_size: 192571468
dataset_size: 915323132
- config_name: templated_hindi_news
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 436136894
num_examples: 42524
download_size: 89441706
dataset_size: 436136894
- config_name: templated_indic_paraphrase
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 731975
num_examples: 1001
download_size: 241632
dataset_size: 731975
- config_name: templated_indic_sentiment
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 730262
num_examples: 1156
download_size: 299936
dataset_size: 730262
- config_name: templated_mintaka
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 18391211
num_examples: 56000
download_size: 3894945
dataset_size: 18391211
- config_name: templated_ntx_llm
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 1185419
num_examples: 506
download_size: 128912
dataset_size: 1185419
- config_name: templated_xlel_wd
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 6084765
num_examples: 3940
download_size: 2157019
dataset_size: 6084765
- config_name: translated_adversarial_qa
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 22985920
num_examples: 10000
download_size: 5618356
dataset_size: 22985920
- config_name: translated_cnn_dailymail
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 598585665
num_examples: 100000
download_size: 218762546
dataset_size: 598585665
- config_name: translated_dolly
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 30828048
num_examples: 14808
download_size: 11858598
dataset_size: 30828048
- config_name: translated_flan_coqa
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 39119861
num_examples: 6409
download_size: 15029790
dataset_size: 39119861
- config_name: translated_flan_cot
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 98934248
num_examples: 91910
download_size: 33869605
dataset_size: 98934248
- config_name: translated_flan_gem_wiki
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 167881959
num_examples: 27147
download_size: 59957637
dataset_size: 167881959
- config_name: translated_flan_lambada
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 3388337
num_examples: 4279
download_size: 1272013
dataset_size: 3388337
- config_name: translated_flan_qa
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 452586
num_examples: 540
download_size: 158337
dataset_size: 452586
- config_name: translated_hotpotqa
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 169705823
num_examples: 355476
download_size: 50061586
dataset_size: 169705823
- config_name: translated_joke_explaination
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 1385133
num_examples: 754
download_size: 269690
dataset_size: 1385133
- config_name: translated_mintaka
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 5854298
num_examples: 14000
download_size: 943132
dataset_size: 5854298
- config_name: translated_nqopen
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 53305791
num_examples: 175850
download_size: 14829292
dataset_size: 53305791
- config_name: translated_paws
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 44491519
num_examples: 49401
download_size: 5853813
dataset_size: 44491519
- config_name: translated_piqa
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 18583099
num_examples: 16113
download_size: 5025762
dataset_size: 18583099
- config_name: translated_soda
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 1167631298
num_examples: 1191582
download_size: 300524712
dataset_size: 1167631298
- config_name: translated_wiki_split
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 994661567
num_examples: 989944
download_size: 304386263
dataset_size: 994661567
- config_name: translated_wikiqa
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 717832
num_examples: 1040
download_size: 258651
dataset_size: 717832
- config_name: translated_xlel_wd
features:
- name: targets
dtype: string
- name: id
dtype: int64
- name: split
dtype: string
- name: sub_dataset_name
dtype: string
- name: task_type
dtype: string
- name: inputs
dtype: string
- name: template_id
dtype: int64
- name: language
dtype: string
- name: script
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 837038415
num_examples: 523112
download_size: 308592573
dataset_size: 837038415
configs:
- config_name: complete_dataset
data_files:
- split: train
path: complete_dataset/train-*
- config_name: templated_hindi_headline
data_files:
- split: train
path: templated_hindi_headline/train-*
- config_name: templated_hindi_news
data_files:
- split: train
path: templated_hindi_news/train-*
- config_name: templated_indic_paraphrase
data_files:
- split: train
path: templated_indic_paraphrase/train-*
- config_name: templated_indic_sentiment
data_files:
- split: train
path: templated_indic_sentiment/train-*
- config_name: templated_mintaka
data_files:
- split: train
path: templated_mintaka/train-*
- config_name: templated_ntx_llm
data_files:
- split: train
path: templated_ntx_llm/train-*
- config_name: templated_xlel_wd
data_files:
- split: train
path: templated_xlel_wd/train-*
- config_name: translated_adversarial_qa
data_files:
- split: train
path: translated_adversarial_qa/train-*
- config_name: translated_cnn_dailymail
data_files:
- split: train
path: translated_cnn_dailymail/train-*
- config_name: translated_dolly
data_files:
- split: train
path: translated_dolly/train-*
- config_name: translated_flan_coqa
data_files:
- split: train
path: translated_flan_coqa/train-*
- config_name: translated_flan_cot
data_files:
- split: train
path: translated_flan_cot/train-*
- config_name: translated_flan_gem_wiki
data_files:
- split: train
path: translated_flan_gem_wiki/train-*
- config_name: translated_flan_lambada
data_files:
- split: train
path: translated_flan_lambada/train-*
- config_name: translated_flan_qa
data_files:
- split: train
path: translated_flan_qa/train-*
- config_name: translated_hotpotqa
data_files:
- split: train
path: translated_hotpotqa/train-*
- config_name: translated_joke_explaination
data_files:
- split: train
path: translated_joke_explaination/train-*
- config_name: translated_mintaka
data_files:
- split: train
path: translated_mintaka/train-*
- config_name: translated_nqopen
data_files:
- split: train
path: translated_nqopen/train-*
- config_name: translated_paws
data_files:
- split: train
path: translated_paws/train-*
- config_name: translated_piqa
data_files:
- split: train
path: translated_piqa/train-*
- config_name: translated_soda
data_files:
- split: train
path: translated_soda/train-*
- config_name: translated_wiki_split
data_files:
- split: train
path: translated_wiki_split/train-*
- config_name: translated_wikiqa
data_files:
- split: train
path: translated_wikiqa/train-*
- config_name: translated_xlel_wd
data_files:
- split: train
path: translated_xlel_wd/train-*
license: apache-2.0
language:
- en
- hi
size_categories:
- 1M<n<10M
---
# Aya_Hindi
This dataset is curated from the original [Aya-Collection](https://huggingface.co/datasets/CohereForAI/aya_collection) dataset, open-sourced by [Cohere](https://cohere.com/research) under the [Apache-2.0](https://choosealicense.com/licenses/apache-2.0/) license.
The Aya Collection is a massive multilingual collection comprising 513 million instances of prompts and completions that cover a wide range of tasks. This collection uses instruction-style templates from fluent speakers and applies them to a curated list of datasets. It also includes translations of instruction-style datasets into 101 languages. The Aya Dataset, a human-curated multilingual instruction and response dataset, is part of this collection. Refer to the original Aya paper for more details about the collection.
### Motivations & Intentions
The original dataset is large and organized by task rather than by language. To work with a specific Indic language, one would previously have needed to download the entire dataset (~600 GB) and filter it.
As we were training an Indic LLM internally, we filtered the dataset by language and curated this dataset.
You can find all the Indic-language specific datasets - [here](https://huggingface.co/collections/Cognitive-Lab/aya-indic-suite-65eaa0e34a2307f30bbd55e5).
## **Data Instances**
An example of a `train` instance looks as follows:
```python
{'id': 246001,
'inputs': 'The following query in English is taken from the geography category. What could be the answer to the question?\nWhat is the seventh tallest mountain in North America?',
'targets': 'The answer is Mount Lucania.',
'dataset_name': 'Mintaka-inst',
'sub_dataset_name': '-',
'task_type': 'question-answering',
'template_id': 3,
'language': 'eng',
'split': 'train',
'script': 'Latn'
}
```
## **Data Fields**
The data fields are the same among all splits:
- `id:` Unique id of the data point
- `inputs:` Prompt or input to the language model.
- `targets:` Completion or output of the language model.
- `dataset_name:` The name of the source dataset that the data point was taken from.
- `sub_dataset_name:` If the source is a collection, this field indicates which part of that collection the data point was taken from. If it is not a collection, this field is left blank.
- `task_type:` The task type that this conversation belongs to.
- `template_id`: The id of the template applied to this data point.
- `language:` The ISO code of the dialect of the conversation.
- `script:` The script of the language.
- `split:` Indicates whether the data point is part of the `train` or the `test` split.
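As a rough illustration of how these fields can be used, here is a minimal, hypothetical sketch that filters records shaped like the instance above by `language` and `task_type`. The sample records and the `filter_records` helper are assumptions for illustration, not part of this dataset's tooling:

```python
# Hypothetical sketch: filter Aya-style records by the fields described above.
# The sample records mirror the structure of the `train` instance shown earlier.

def filter_records(records, language=None, task_type=None):
    """Keep records matching the given ISO language code and/or task type."""
    out = []
    for rec in records:
        if language is not None and rec["language"] != language:
            continue
        if task_type is not None and rec["task_type"] != task_type:
            continue
        out.append(rec)
    return out

records = [
    {"id": 246001, "language": "eng", "script": "Latn",
     "task_type": "question-answering", "split": "train"},
    {"id": 246002, "language": "hin", "script": "Deva",
     "task_type": "summarization", "split": "train"},
]

hindi_only = filter_records(records, language="hin")
qa_only = filter_records(records, task_type="question-answering")
```

The same pattern can be applied with `datasets.Dataset.filter` once a config of this dataset is loaded.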
## **Licensing Information**
This dataset can be used for any purpose, whether academic or commercial, under the terms of the **[Apache 2.0](https://opensource.org/license/apache-2-0)** License.
## Citation
```bibtex
@misc{singh2024aya,
title={Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning},
author={Shivalika Singh and Freddie Vargus and Daniel Dsouza and Börje F. Karlsson and Abinaya Mahendiran and Wei-Yin Ko and Herumb Shandilya and Jay Patel and Deividas Mataciunas and Laura OMahony and Mike Zhang and Ramith Hettiarachchi and Joseph Wilson and Marina Machado and Luisa Souza Moura and Dominik Krzemiński and Hakimeh Fadaei and Irem Ergün and Ifeoma Okoh and Aisha Alaagib and Oshan Mudannayake and Zaid Alyafeai and Vu Minh Chien and Sebastian Ruder and Surya Guthikonda and Emad A. Alghamdi and Sebastian Gehrmann and Niklas Muennighoff and Max Bartolo and Julia Kreutzer and Ahmet Üstün and Marzieh Fadaee and Sara Hooker},
year={2024},
eprint={2402.06619},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
krishna8421/chats_with_title | ---
license: mit
---
|
untilthend/lite | ---
license: openrail
---
|
npx/gg_2907 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': combat
'1': destroyed_buildings
'2': fire
'3': human_aid_rehabilitation
'4': military_vehicles
splits:
- name: train
num_bytes: 153077639.0
num_examples: 11500
- name: test
num_bytes: 2264965.0
num_examples: 184
download_size: 157804281
dataset_size: 155342604.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
andersonbcdefg/anthropic-hh-rlhf-conversations-with-toxicities | ---
dataset_info:
features:
- name: messages
sequence: string
- name: length
dtype: int64
- name: toxicity
dtype: float64
splits:
- name: train
num_bytes: 117886688
num_examples: 104876
download_size: 68186422
dataset_size: 117886688
---
# Dataset Card for "anthropic-hh-rlhf-conversations-with-toxicities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling | ---
pretty_name: Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AIGym/deepseek-coder-1.3b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat-and-function-calling)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T23:54:42.230347](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling/blob/main/results_2024-02-04T23-54-42.230347.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26966838532529325,\n\
\ \"acc_stderr\": 0.03152856048648718,\n \"acc_norm\": 0.2711365940971498,\n\
\ \"acc_norm_stderr\": 0.03228813751443345,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.43371704166991554,\n\
\ \"mc2_stderr\": 0.01505484479340333\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132865,\n\
\ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3301135232025493,\n\
\ \"acc_stderr\": 0.004692926794268451,\n \"acc_norm\": 0.3926508663612826,\n\
\ \"acc_norm_stderr\": 0.004873421833291587\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.038270523579507554,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.038270523579507554\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.16184971098265896,\n\
\ \"acc_stderr\": 0.028083594279575765,\n \"acc_norm\": 0.16184971098265896,\n\
\ \"acc_norm_stderr\": 0.028083594279575765\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.30638297872340425,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.30638297872340425,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.02345603738398202,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.02345603738398202\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.03619604524124251,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.03619604524124251\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2709677419354839,\n \"acc_stderr\": 0.02528441611490016,\n \"\
acc_norm\": 0.2709677419354839,\n \"acc_norm_stderr\": 0.02528441611490016\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.03031371053819889,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.03031371053819889\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547836,\n \"\
acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.30392156862745096,\n \"acc_stderr\": 0.032282103870378914,\n \"\
acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.032282103870378914\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n\
\ \"acc_stderr\": 0.029763779406874975,\n \"acc_norm\": 0.26905829596412556,\n\
\ \"acc_norm_stderr\": 0.029763779406874975\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.30578512396694213,\n \"acc_stderr\": 0.04205953933884124,\n \"\
acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.04205953933884124\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3162393162393162,\n\
\ \"acc_stderr\": 0.030463656747340265,\n \"acc_norm\": 0.3162393162393162,\n\
\ \"acc_norm_stderr\": 0.030463656747340265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n\
\ \"acc_stderr\": 0.016160871405127536,\n \"acc_norm\": 0.28607918263090676,\n\
\ \"acc_norm_stderr\": 0.016160871405127536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500118,\n\
\ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500118\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364546,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364546\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2711864406779661,\n\
\ \"acc_stderr\": 0.011354581451622985,\n \"acc_norm\": 0.2711864406779661,\n\
\ \"acc_norm_stderr\": 0.011354581451622985\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33455882352941174,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.33455882352941174,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355568,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355568\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.43371704166991554,\n\
\ \"mc2_stderr\": 0.01505484479340333\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.014044390401612976\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \
\ \"acc_stderr\": 0.00500021260077329\n }\n}\n```"
repo_url: https://huggingface.co/AIGym/deepseek-coder-1.3b-chat-and-function-calling
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|arc:challenge|25_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|gsm8k|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hellaswag|10_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T23-54-42.230347.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- '**/details_harness|winogrande|5_2024-02-04T23-54-42.230347.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T23-54-42.230347.parquet'
- config_name: results
data_files:
- split: 2024_02_04T23_54_42.230347
path:
- results_2024-02-04T23-54-42.230347.parquet
- split: latest
path:
- results_2024-02-04T23-54-42.230347.parquet
---
# Dataset Card for Evaluation run of AIGym/deepseek-coder-1.3b-chat-and-function-calling
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AIGym/deepseek-coder-1.3b-chat-and-function-calling](https://huggingface.co/AIGym/deepseek-coder-1.3b-chat-and-function-calling) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T23:54:42.230347](https://huggingface.co/datasets/open-llm-leaderboard/details_AIGym__deepseek-coder-1.3b-chat-and-function-calling/blob/main/results_2024-02-04T23-54-42.230347.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26966838532529325,
"acc_stderr": 0.03152856048648718,
"acc_norm": 0.2711365940971498,
"acc_norm_stderr": 0.03228813751443345,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.43371704166991554,
"mc2_stderr": 0.01505484479340333
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132865,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.3301135232025493,
"acc_stderr": 0.004692926794268451,
"acc_norm": 0.3926508663612826,
"acc_norm_stderr": 0.004873421833291587
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.038270523579507554,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.038270523579507554
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.16184971098265896,
"acc_stderr": 0.028083594279575765,
"acc_norm": 0.16184971098265896,
"acc_norm_stderr": 0.028083594279575765
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.30638297872340425,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.30638297872340425,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.02345603738398202,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.02345603738398202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124251,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124251
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547836,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.032282103870378914,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.032282103870378914
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874975,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874975
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.04205953933884124,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.04205953933884124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340265,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127536,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500118,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500118
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364546,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364546
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2711864406779661,
"acc_stderr": 0.011354581451622985,
"acc_norm": 0.2711864406779661,
"acc_norm_stderr": 0.011354581451622985
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33455882352941174,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.33455882352941174,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355568,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355568
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.43371704166991554,
"mc2_stderr": 0.01505484479340333
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612976
},
"harness|gsm8k|5": {
"acc": 0.03411675511751327,
"acc_stderr": 0.00500021260077329
}
}
```
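As a hedged illustration of how such aggregated metrics relate to the per-task entries above, the sketch below recomputes a mean accuracy over tasks from a results dict shaped like the JSON above (the dict here is a small excerpt, not the full results; `mean_acc` is a hypothetical helper, not part of the leaderboard tooling):

```python
# Sketch: average per-task accuracies from a results dict shaped like the
# JSON above. Keys follow the same "harness|<task>|<n-shot>" naming.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.28888888888888886},
    "harness|arc:challenge|25": {"acc": 0.22696245733788395},
}

def mean_acc(results, prefix="harness|hendrycksTest-"):
    """Mean accuracy over tasks whose key starts with `prefix`."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))  # → 0.2744 (mean over the two MMLU tasks)
```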
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Sai-Manisha/Fine-tuning-feb-5 | ---
license: mit
---
|
thobauma/harmless-poisoned-0.03-SUDO-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_uninflect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1702
num_examples: 10
- name: test
num_bytes: 11628
num_examples: 40
- name: train
num_bytes: 15115
num_examples: 76
download_size: 17799
dataset_size: 28445
---
# Dataset Card for "MULTI_VALUE_wnli_uninflect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wetdog/TUT-urban-acoustic-scenes-2018-development | ---
dataset_info:
features:
- name: scene_label
dtype: string
- name: identifier
dtype: string
- name: source_label
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 24883936611.28
num_examples: 8640
download_size: 24885037396
dataset_size: 24883936611.28
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: afl-3.0
task_categories:
- audio-classification
size_categories:
- 1K<n<10K
---
# Dataset Card for "TUT-urban-acoustic-scenes-2018-development"
## Dataset Description
- **Homepage:** https://zenodo.org/record/1228142
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** Toni Heittola (toni.heittola@tut.fi, http://www.cs.tut.fi/~heittolt/)
### Dataset Summary
TUT Urban Acoustic Scenes 2018 development dataset consists of 10-second audio segments from 10 acoustic scenes:

- Airport - `airport`
- Indoor shopping mall - `shopping_mall`
- Metro station - `metro_station`
- Pedestrian street - `street_pedestrian`
- Public square - `public_square`
- Street with medium level of traffic - `street_traffic`
- Travelling by a tram - `tram`
- Travelling by a bus - `bus`
- Travelling by an underground metro - `metro`
- Urban park - `park`

Each acoustic scene has 864 segments (144 minutes of audio). The dataset contains in total 24 hours of audio.
The dataset was collected in Finland by Tampere University of Technology between 02/2018 - 03/2018.
The data collection has received funding from the European Research Council under the ERC Grant Agreement 637422 EVERYSOUND.
### Supported Tasks and Leaderboards
- `audio-classification`: the dataset can be used to train a model for acoustic scene classification, which consists in assigning one of the 10 scene labels to a 10-second audio segment. Success on this task is typically measured by classification accuracy.
## Dataset Structure
### Data Instances
```
{
'scene_label': 'airport',
'identifier': 'barcelona-0',
'source_label': 'a',
 'audio': {'path': '/data/airport-barcelona-0-0-a.wav',
'array': array([-1.91628933e-04, -1.18494034e-04, -1.87635422e-04, ...,
4.90546227e-05, -4.98890877e-05, -4.66108322e-05]),
'sampling_rate': 48000}
}
```
### Data Fields
- `scene_label`: acoustic scene label from the 10-class set.
- `identifier`: city-location id, e.g. 'barcelona-0'.
- `source_label`: device id, which for this dataset is always 'a'.

Filenames of the dataset follow the pattern:

`[scene label]-[city]-[location id]-[segment id]-[device id].wav`
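As a minimal sketch (a hypothetical helper, not shipped with the dataset), the filename pattern above can be parsed into its fields like this:

```python
import re

# Sketch of parsing the filename pattern described above:
# [scene label]-[city]-[location id]-[segment id]-[device id].wav
# Scene labels use '_' internally (e.g. street_pedestrian), so splitting
# on '-' is unambiguous for this dataset's label set.
FILENAME_RE = re.compile(
    r"(?P<scene_label>[a-z_]+)-(?P<city>[a-z]+)-(?P<location_id>\d+)"
    r"-(?P<segment_id>\d+)-(?P<device_id>[a-z])\.wav"
)

def parse_filename(name: str) -> dict:
    m = FILENAME_RE.fullmatch(name)
    if m is None:
        raise ValueError(f"unexpected filename: {name}")
    return m.groupdict()

print(parse_filename("airport-barcelona-0-0-a.wav"))
```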
### Data Splits
A suggested training/test partitioning of the development set is provided in order to make results reported with this dataset uniform. The partitioning is done such that the segments recorded at the same location are included into the same subset - either training or testing. The partitioning is done aiming for a 70/30 ratio between the number of segments in training and test subsets while taking into account recording locations, and selecting the closest available option.
| Scene class | Train / Segments | Train / Locations | Test / Segments | Test / Locations |
| ------------------ | ---------------- | ----------------- | --------------- | ---------------- |
| Airport | 599 | 15 | 265 | 7 |
| Bus | 622 | 26 | 242 | 10 |
| Metro | 603 | 20 | 261 | 9 |
| Metro station | 605 | 28 | 259 | 12 |
| Park | 622 | 18 | 242 | 7 |
| Public square | 648 | 18 | 216 | 6 |
| Shopping mall | 585 | 16 | 279 | 6 |
| Street, pedestrian | 617 | 20 | 247 | 8 |
| Street, traffic | 618 | 18 | 246 | 7 |
| Tram | 603 | 24 | 261 | 11 |
| **Total** | **6122** | **203** | **2518** | **83** |
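The stated 70/30 target can be sanity-checked against the totals in the table with a quick arithmetic sketch:

```python
# Totals taken from the table above.
train_segments, test_segments = 6122, 2518
total = train_segments + test_segments      # 8640 segments in the development set
train_ratio = train_segments / total
print(f"train share: {train_ratio:.1%}")    # ~70.9%, close to the 70/30 target
```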
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The dataset was recorded in six large European cities: Barcelona, Helsinki, London, Paris, Stockholm, and Vienna. For all acoustic scenes, audio was captured in multiple locations: different streets, different parks, different shopping malls. In each location, multiple 2-3 minute long audio recordings were captured in a few slightly different positions (2-4) within the selected location. The collected audio material was cut into 10-second segments.
The equipment used for recording consists of a binaural [Soundman OKM II Klassik/studio A3](http://www.soundman.de/en/products/) electret in-ear microphone and a [Zoom F8](https://www.zoom.co.jp/products/handy-recorder/zoom-f8-multitrack-field-recorder) audio recorder using 48 kHz sampling rate and 24 bit resolution. During the recording, the microphones were worn by the recording person in the ears, and head movement was kept to a minimum.
### Annotations
#### Annotation process
Post-processing of the recorded audio involves aspects related to the privacy of recorded individuals and possible errors in the recording process. Some interference from mobile phones is audible, but it is considered part of the real-world recording process.
#### Who are the annotators?
* Ronal Bejarano Rodriguez
* Eemi Fagerlund
* Aino Koskimies
* Toni Heittola
### Personal and Sensitive Information
The material was screened for content, and segments containing close microphone conversation were eliminated.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Toni Heittola (toni.heittola@tut.fi, http://www.cs.tut.fi/~heittolt/)
Annamaria Mesaros (annamaria.mesaros@tut.fi, http://www.cs.tut.fi/~mesaros/)
Tuomas Virtanen (tuomas.virtanen@tut.fi, http://www.cs.tut.fi/~tuomasv/)
### Licensing Information
Copyright (c) 2018 Tampere University of Technology and its licensors
All rights reserved.
Permission is hereby granted, without written agreement and without license or royalty
fees, to use and copy the TUT Urban Acoustic Scenes 2018 (“Work”) described in this document
and composed of audio and metadata. This grant is only for experimental and non-commercial
purposes, provided that the copyright notice in its entirety appear in all copies of this Work,
and the original source of this Work, (Audio Research Group from Laboratory of Signal
Processing at Tampere University of Technology),
is acknowledged in any publication that reports research using this Work.
Any commercial use of the Work or any part thereof is strictly prohibited.
Commercial use include, but is not limited to:
- selling or reproducing the Work
- selling or distributing the results or content achieved by use of the Work
- providing services by using the Work.
IN NO EVENT SHALL TAMPERE UNIVERSITY OF TECHNOLOGY OR ITS LICENSORS BE LIABLE TO ANY PARTY
FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE
OF THIS WORK AND ITS DOCUMENTATION, EVEN IF TAMPERE UNIVERSITY OF TECHNOLOGY OR ITS
LICENSORS HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
TAMPERE UNIVERSITY OF TECHNOLOGY AND ALL ITS LICENSORS SPECIFICALLY DISCLAIMS ANY
WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS FOR A PARTICULAR PURPOSE. THE WORK PROVIDED HEREUNDER IS ON AN "AS IS" BASIS, AND
THE TAMPERE UNIVERSITY OF TECHNOLOGY HAS NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT,
UPDATES, ENHANCEMENTS, OR MODIFICATIONS.
### Citation Information
[](https://doi.org/10.5281/zenodo.1228142)
### Contributions
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlexAndriten/venvTest | ---
license: unknown
---
|
Tippawan/snm-siriraj | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 34050
num_examples: 9
- name: validation
num_bytes: 7492
num_examples: 2
- name: test
num_bytes: 3569
num_examples: 1
download_size: 14144
dataset_size: 45111
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
fmagot01/videos_0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: video_data
dtype: binary
- name: duration_seconds
dtype: int64
- name: video_path
dtype: string
splits:
- name: train
num_bytes: 5759485
num_examples: 5
download_size: 5747829
dataset_size: 5759485
---
# Dataset Card for "videos_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-astronomy-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7144
num_examples: 5
- name: test
num_bytes: 498344
num_examples: 152
download_size: 15249
dataset_size: 505488
---
# Dataset Card for "mmlu-astronomy-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datajuicer/redpajama-cc-2021-04-refined-by-data-juicer | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- data-juicer
- pretraining
size_categories:
- 10M<n<100M
---
# RedPajama -- CommonCrawl-2021-04 (refined by Data-Juicer)
A refined version of the CommonCrawl-2021-04 dataset in RedPajama, produced by [Data-Juicer](https://github.com/alibaba/data-juicer). Some low-quality ("bad") samples were removed from the original dataset to make it higher quality.
This dataset is usually used to pretrain a Large Language Model.
**Notice**: Here is a small subset for previewing. The whole dataset is available [here](https://dail-wlcb.oss-cn-wulanchabu.aliyuncs.com/LLM_data/our_refined_datasets/pretraining/redpajama-cc-refine-results/redpajama-cc-2021-04-refine-result.jsonl) (About 284GB).
## Dataset Information
- Number of samples: 44,724,752 (Keep ~45.23% from the original dataset)
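For intuition about what a filter such as the recipe's `alphanumeric_filter` does, here is an illustrative re-implementation of the ratio check, under the assumption that the ratio is simply the share of alphanumeric characters in the text (see Data-Juicer's operator documentation for the exact definition):

```python
def alphanumeric_ratio(text: str) -> float:
    """Fraction of characters in `text` that are alphanumeric."""
    if not text:
        return 0.0
    return sum(ch.isalnum() for ch in text) / len(text)

def keep_sample(text: str, min_ratio: float = 0.7494, max_ratio: float = 0.8595) -> bool:
    """Mimic the recipe's alphanumeric_filter thresholds (3-sigma bounds)."""
    return min_ratio <= alphanumeric_ratio(text) <= max_ratio

print(keep_sample("A normal English sentence with words."))  # True: ratio ~0.84
print(keep_sample("!!! ### $$$ %%% &&&"))                    # False: no alphanumerics
```

Samples falling outside the band are dropped; the other filters in the recipe below apply analogous thresholds to line length, perplexity, word counts, and so on.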
## Refining Recipe
```yaml
# global parameters
project_name: 'Data-Juicer-recipes-cc-2021-04'
dataset_path: '/path/to/your/dataset' # path to your dataset directory or file
export_path: '/path/to/your/dataset.jsonl'
np: 50 # number of subprocesses to process your dataset
open_tracer: true
# process schedule
# a list of several process operators with their arguments
process:
- document_simhash_deduplicator:
tokenization: space
window_size: 6
lowercase: true
ignore_pattern: '\p{P}'
num_blocks: 6
hamming_distance: 4
- clean_email_mapper:
- clean_links_mapper:
- fix_unicode_mapper:
- punctuation_normalization_mapper:
- whitespace_normalization_mapper:
- alphanumeric_filter:
tokenization: false
min_ratio: 0.7494 # 3sigma
max_ratio: 0.8595 # 3sigma -- 1001790
- average_line_length_filter: # for code
max_len: 1500 # < 3sigma (2817) -- 541131
- character_repetition_filter:
rep_len: 10
max_ratio: 0.3 # > 3sigma (0.1463) -- 159152
- flagged_words_filter:
lang: en
tokenization: true
max_ratio: 0.0019 # 3sigma -- 184714
- language_id_score_filter: # remove language filter
min_score: 0.786 # 3sigma -- 1995115
- maximum_line_length_filter: # for code
max_len: 5000 # < 3sigma -- 1076085
- perplexity_filter:
lang: en
max_ppl: 5000 # < 3sigma -- 906649
- special_characters_filter:
min_ratio: 0.15 # > 3sigma
max_ratio: 0.35 # > 3sigma -- 1046590
- text_length_filter:
max_len: 61592 # 3sigma -- 1114727
- words_num_filter:
lang: en
tokenization: true
min_num: 20 # > 3sigma
max_num: 12241 # 3sigma -- 1120334
- word_repetition_filter:
lang: en
tokenization: true
rep_len: 10
max_ratio: 0.3105 # 3sigma -- 2234933
``` |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_C_HM_A_T_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 1169038
num_examples: 1000
download_size: 0
dataset_size: 1169038
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_C_HM_A_T_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
techiaith/oscar-tts | ---
license: cc0-1.0
dataset_info:
features:
- name: sentence
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 123239430791.0
num_examples: 2323504
- name: test
num_bytes: 121239204.0
num_examples: 2326
download_size: 122442238526
dataset_size: 123360669995.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_176 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1282897360.0
num_examples: 249980
download_size: 1314632595
dataset_size: 1282897360.0
---
# Dataset Card for "chunk_176"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3 | ---
pretty_name: Evaluation run of nlpguy/Hermes-low-tune-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/Hermes-low-tune-3](https://huggingface.co/nlpguy/Hermes-low-tune-3) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T19:55:30.793353](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3/blob/main/results_2024-01-06T19-55-30.793353.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6402020080388997,\n\
\ \"acc_stderr\": 0.032280751866764705,\n \"acc_norm\": 0.6414551350223837,\n\
\ \"acc_norm_stderr\": 0.03293149995276801,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5793658606194433,\n\
\ \"mc2_stderr\": 0.01538436656194187\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491887,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283507\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6619199362676758,\n\
\ \"acc_stderr\": 0.0047208915971747294,\n \"acc_norm\": 0.8499302927703645,\n\
\ \"acc_norm_stderr\": 0.003564098420387769\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n\
\ \"acc_stderr\": 0.016040454426164464,\n \"acc_norm\": 0.358659217877095,\n\
\ \"acc_norm_stderr\": 0.016040454426164464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5793658606194433,\n\
\ \"mc2_stderr\": 0.01538436656194187\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249787\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6383623957543594,\n \
\ \"acc_stderr\": 0.013234658351088766\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/Hermes-low-tune-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|arc:challenge|25_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|gsm8k|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hellaswag|10_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T19-55-30.793353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T19-55-30.793353.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- '**/details_harness|winogrande|5_2024-01-06T19-55-30.793353.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T19-55-30.793353.parquet'
- config_name: results
data_files:
- split: 2024_01_06T19_55_30.793353
path:
- results_2024-01-06T19-55-30.793353.parquet
- split: latest
path:
- results_2024-01-06T19-55-30.793353.parquet
---
# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Hermes-low-tune-3](https://huggingface.co/nlpguy/Hermes-low-tune-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3",
"harness_winogrande_5",
split="train")
```
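The per-run split names used above are derived from the run timestamp: characters that are not valid in split names (`-` and `:`) appear replaced with underscores in this card's configurations. A minimal sketch of that mapping (inferred from the names in this card, not an official utility):

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Convert a run timestamp (as it appears in the results filename)
    into the split name used in this dataset's configurations."""
    # '-' and ':' become '_'; the fractional seconds are kept as-is.
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-01-06T19:55:30.793353"))
# 2024_01_06T19_55_30.793353
```

This can be handy for selecting a specific run's split instead of `"latest"` when several evaluation runs accumulate in the repository.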
## Latest results
These are the [latest results from run 2024-01-06T19:55:30.793353](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3/blob/main/results_2024-01-06T19-55-30.793353.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6402020080388997,
"acc_stderr": 0.032280751866764705,
"acc_norm": 0.6414551350223837,
"acc_norm_stderr": 0.03293149995276801,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5793658606194433,
"mc2_stderr": 0.01538436656194187
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491887,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283507
},
"harness|hellaswag|10": {
"acc": 0.6619199362676758,
"acc_stderr": 0.0047208915971747294,
"acc_norm": 0.8499302927703645,
"acc_norm_stderr": 0.003564098420387769
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.016040454426164464,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.016040454426164464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083135,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5793658606194433,
"mc2_stderr": 0.01538436656194187
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249787
},
"harness|gsm8k|5": {
"acc": 0.6383623957543594,
"acc_stderr": 0.013234658351088766
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
heliosprime/twitter_dataset_1713034166 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11675
num_examples: 27
download_size: 8792
dataset_size: 11675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713034166"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SDbiaseval/jobs-dalle-2 | ---
dataset_info:
features:
- name: adjective
dtype: string
- name: profession
dtype: string
- name: 'no'
dtype: int32
- name: image_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 24806949244.5
num_examples: 31500
download_size: 18481427451
dataset_size: 24806949244.5
---
# Dataset Card for "dataset-dalle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha | ---
pretty_name: Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [indischepartij/TinyUltra-4x1.1B-Base-Alpha](https://huggingface.co/indischepartij/TinyUltra-4x1.1B-Base-Alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T23:08:05.664341](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha/blob/main/results_2024-02-01T23-08-05.664341.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26201452650295837,\n\
\ \"acc_stderr\": 0.030950575098959248,\n \"acc_norm\": 0.26190159146597486,\n\
\ \"acc_norm_stderr\": 0.03169834440202644,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731618,\n \"mc2\": 0.3758799861882878,\n\
\ \"mc2_stderr\": 0.014070883279660485\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3447098976109215,\n \"acc_stderr\": 0.01388881628678211,\n\
\ \"acc_norm\": 0.34897610921501704,\n \"acc_norm_stderr\": 0.013928933461382504\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46594303923521213,\n\
\ \"acc_stderr\": 0.004978192893406287,\n \"acc_norm\": 0.6142202748456482,\n\
\ \"acc_norm_stderr\": 0.004857840934549174\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173043,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173043\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\
\ \"acc_stderr\": 0.024993053397764826,\n \"acc_norm\": 0.26129032258064516,\n\
\ \"acc_norm_stderr\": 0.024993053397764826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782405,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782405\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.02880139219363128,\n \
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.02880139219363128\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2886334610472541,\n\
\ \"acc_stderr\": 0.016203792703197804,\n \"acc_norm\": 0.2886334610472541,\n\
\ \"acc_norm_stderr\": 0.016203792703197804\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261445,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261445\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824765,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824765\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.0257700156442904,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.0257700156442904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.0108859297420022,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.0108859297420022\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.023661699177098622,\n\
\ \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 0.023661699177098622\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731618,\n \"mc2\": 0.3758799861882878,\n\
\ \"mc2_stderr\": 0.014070883279660485\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6574585635359116,\n \"acc_stderr\": 0.013337483579075929\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \
\ \"acc_stderr\": 0.004365042953621804\n }\n}\n```"
repo_url: https://huggingface.co/indischepartij/TinyUltra-4x1.1B-Base-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|arc:challenge|25_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|gsm8k|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hellaswag|10_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T23-08-05.664341.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- '**/details_harness|winogrande|5_2024-02-01T23-08-05.664341.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T23-08-05.664341.parquet'
- config_name: results
data_files:
- split: 2024_02_01T23_08_05.664341
path:
- results_2024-02-01T23-08-05.664341.parquet
- split: latest
path:
- results_2024-02-01T23-08-05.664341.parquet
---
# Dataset Card for Evaluation run of indischepartij/TinyUltra-4x1.1B-Base-Alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/TinyUltra-4x1.1B-Base-Alpha](https://huggingface.co/indischepartij/TinyUltra-4x1.1B-Base-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha",
"harness_winogrande_5",
	split="latest")
```
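Once loaded, the per-task metrics follow the schema of the JSON shown under "Latest results" below: keys like `harness|arc:challenge|25` mapping to dicts of `acc`, `acc_norm`, and their standard errors. As an illustration of working with that structure, here is a minimal sketch that collects normalized accuracy per task; the `sample` dict is hand-written for the example (it mirrors the schema but is not the full results file):

```python
# Minimal sketch: extract per-task normalized accuracy from a results-style
# dict. The "sample" dict below is hand-written for illustration only and
# mirrors the key/metric schema of the JSON in the "Latest results" section.
sample = {
    "harness|arc:challenge|25": {"acc": 0.3447, "acc_norm": 0.3490},
    "harness|hellaswag|10": {"acc": 0.4659, "acc_norm": 0.6142},
}

# Task keys have the form "harness|<task>|<n_shots>"; the middle field is
# the task name.
acc_norm_by_task = {
    key.split("|")[1]: metrics["acc_norm"] for key, metrics in sample.items()
}
print(acc_norm_by_task)
```

The same pattern applies to the full results dict returned for the "results" configuration, since every harness entry shares this key layout.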
## Latest results
These are the [latest results from run 2024-02-01T23:08:05.664341](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__TinyUltra-4x1.1B-Base-Alpha/blob/main/results_2024-02-01T23-08-05.664341.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26201452650295837,
"acc_stderr": 0.030950575098959248,
"acc_norm": 0.26190159146597486,
"acc_norm_stderr": 0.03169834440202644,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731618,
"mc2": 0.3758799861882878,
"mc2_stderr": 0.014070883279660485
},
"harness|arc:challenge|25": {
"acc": 0.3447098976109215,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.34897610921501704,
"acc_norm_stderr": 0.013928933461382504
},
"harness|hellaswag|10": {
"acc": 0.46594303923521213,
"acc_stderr": 0.004978192893406287,
"acc_norm": 0.6142202748456482,
"acc_norm_stderr": 0.004857840934549174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173043,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173043
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764826,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782405,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782405
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560486,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.02880139219363128,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.02880139219363128
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2886334610472541,
"acc_stderr": 0.016203792703197804,
"acc_norm": 0.2886334610472541,
"acc_norm_stderr": 0.016203792703197804
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261445,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261445
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824765,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824765
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.0257700156442904,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.0257700156442904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.0108859297420022,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.0108859297420022
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.16326530612244897,
"acc_stderr": 0.023661699177098622,
"acc_norm": 0.16326530612244897,
"acc_norm_stderr": 0.023661699177098622
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731618,
"mc2": 0.3758799861882878,
"mc2_stderr": 0.014070883279660485
},
"harness|winogrande|5": {
"acc": 0.6574585635359116,
"acc_stderr": 0.013337483579075929
},
"harness|gsm8k|5": {
"acc": 0.02577710386656558,
"acc_stderr": 0.004365042953621804
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tasksource/simple_pair | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: config
dtype: string
splits:
- name: train
num_bytes: 667441
num_examples: 6000
- name: test
num_bytes: 24937007
num_examples: 224000
download_size: 4749047
dataset_size: 25604448
---
# Dataset Card for "simple_pair"
```
@inproceedings{luo-etal-2022-simple-challenging,
title = "Simple but Challenging: Natural Language Inference Models Fail on Simple Sentences",
author = "Luo, Cheng and
Liu, Wei and
Lin, Jieyu and
Zou, Jiajie and
Xiang, Ming and
Ding, Nai",
editor = "Goldberg, Yoav and
Kozareva, Zornitsa and
Zhang, Yue",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2022",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-emnlp.252",
doi = "10.18653/v1/2022.findings-emnlp.252",
pages = "3449--3462",
}
``` |
itamarcard/autoridade | ---
license: openrail
---
|
loaiabdalslam/Scrapping-dataset-llm | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: code
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 248233
num_examples: 7
download_size: 41758
dataset_size: 248233
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maghwa/OpenHermes-2-AR-10K-15-370k-380k | ---
dataset_info:
features:
- name: title
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: topic
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: model_name
dtype: 'null'
- name: source
dtype: string
- name: views
dtype: float64
- name: model
dtype: 'null'
- name: conversations
dtype: string
- name: language
dtype: 'null'
- name: hash
dtype: 'null'
- name: category
dtype: 'null'
- name: idx
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: id
dtype: 'null'
- name: system_prompt
dtype: 'null'
splits:
- name: train
num_bytes: 30562022
num_examples: 10001
download_size: 14196809
dataset_size: 30562022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ccmusic-database/bel_canto | ---
license: mit
task_categories:
- audio-classification
- image-classification
language:
- zh
- en
tags:
- music
- art
pretty_name: Bel Canto and Chinese Folk Song Singing Tech
size_categories:
- 1K<n<10K
viewer: false
---
# Dataset Card for Bel Canto and Chinese Folk Song Singing Tech
The raw dataset contains 203 a cappella singing clips (sampled at 22,050 Hz) sung in two styles, Bel Canto and the Chinese folk singing style, all performed by professional vocalists and recorded in professional commercial recording studios. Besides the original version, a pre-processed version is included.
## Usage
### Eval Subset
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/bel_canto", name="eval")
for item in ds["train"]:
print(item)
for item in ds["validation"]:
print(item)
for item in ds["test"]:
print(item)
```
### Raw Subset
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/bel_canto", name="default")
for item in ds["train"]:
print(item)
for item in ds["validation"]:
print(item)
for item in ds["test"]:
print(item)
```
## Maintenance
```bash
GIT_LFS_SKIP_SMUDGE=1 git clone git@hf.co:datasets/ccmusic-database/bel_canto
cd bel_canto
```
## Dataset Description
- **Homepage:** <https://ccmusic-database.github.io>
- **Repository:** <https://huggingface.co/datasets/ccmusic-database/bel_canto>
- **Paper:** <https://doi.org/10.5281/zenodo.5676893>
- **Leaderboard:** <https://ccmusic-database.github.io/team.html>
- **Point of Contact:** <https://www.modelscope.cn/datasets/ccmusic/bel_canto>
### Dataset Summary
This database contains hundreds of a cappella singing clips sung in two styles, Bel Canto and the Chinese national singing style, all performed by professional vocalists and recorded in professional commercial recording studios.
### Supported Tasks and Leaderboards
Audio classification, Image classification, singing method classification, voice classification
### Languages
Chinese, English
## Dataset Structure
<style>
.belcanto td {
vertical-align: middle !important;
text-align: center;
}
.belcanto th {
text-align: center;
}
</style>
### Eval Subset
<table class="belcanto">
<tr>
<th>mel<br>(.jpg, 1.6s, 48000Hz)</th>
<th>cqt<br>(.jpg, 1.6s, 48000Hz)</th>
<th>chroma<br>(.jpg, 1.6s, 48000Hz)</th>
<th>label<br>(4-class)</th>
<th>gender<br>(2-class)</th>
<th>singing_method<br>(2-class)</th>
</tr>
<tr>
<td><img src="https://cdn-uploads.huggingface.co/production/uploads/655e0a5b8c2d4379a71882a9/TSTXTg2s2j6gs3O8q_bpD.jpeg"></td>
<td><img src="https://cdn-uploads.huggingface.co/production/uploads/655e0a5b8c2d4379a71882a9/BiuWkk_rkYBfN2hqG60Iy.jpeg"></td>
<td><img src="https://cdn-uploads.huggingface.co/production/uploads/655e0a5b8c2d4379a71882a9/WmcP0UsMe_9lmLmNpAOzr.jpeg"></td>
<td>m_bel, f_bel, m_folk, f_folk</td>
<td>male, female</td>
<td>Folk_Singing, Bel_Canto</td>
</tr>
<tr>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
</tr>
</table>
### Raw Subset
<table>
<tr>
<th>audio(.wav, 22050Hz)</th>
<th>mel(spectrogram, .jpg, 22050Hz)</th>
<th>label(4-class)</th>
<th>gender(2-class)</th>
<th>singing_method(2-class)</th>
</tr>
<tr>
<td><audio controls src="https://huggingface.co/datasets/ccmusic-database/bel_canto/resolve/main/data/%E5%A5%B3%E7%BE%8E%E5%A3%B0%2035.wav"></audio></td>
<td><img src="./data/女美声 35.jpg"></td>
<td>m_bel, f_bel, m_folk, f_folk</td>
<td>male, female</td>
<td>Folk_Singing, Bel_Canto</td>
</tr>
<tr>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
<td>...</td>
</tr>
</table>
### Data Instances
.zip(.wav, .jpg)
### Data Fields
m_bel, f_bel, m_folk, f_folk
### Data Splits
| Split | Eval | Raw |
| :-------------: | :---: | :---: |
| total | 9603 | 203 |
| train(80%) | 7682 | 162 |
| validation(10%) | 960 | 20 |
| test(10%) | 961 | 21 |
## Dataset Creation
### Curation Rationale
Lack of a dataset for Bel Canto and Chinese folk song singing tech
### Source Data
#### Initial Data Collection and Normalization
Zhaorui Liu, Monan Zhou
#### Who are the source language producers?
Students from CCMUSIC
### Annotations
#### Annotation process
All of them are sung by professional vocalists and were recorded in professional commercial recording studios.
#### Who are the annotators?
professional vocalists
### Personal and Sensitive Information
None
## Considerations for Using the Data
### Social Impact of Dataset
Promoting the development of AI in the music industry
### Discussion of Biases
Only for Chinese songs
### Other Known Limitations
Some singers may not have enough professional training in classical or ethnic vocal techniques.
## Additional Information
### Dataset Curators
Zijin Li
### Evaluation
<https://huggingface.co/ccmusic-database/bel_canto>
### Licensing Information
```
MIT License
Copyright (c) CCMUSIC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```bibtex
@dataset{zhaorui_liu_2021_5676893,
author = {Monan Zhou, Shenyang Xu, Zhaorui Liu, Zhaowen Wang, Feng Yu, Wei Li and Baoqiang Han},
title = {CCMusic: an Open and Diverse Database for Chinese and General Music Information Retrieval Research},
month = {mar},
year = {2024},
publisher = {HuggingFace},
version = {1.2},
url = {https://huggingface.co/ccmusic-database}
}
```
### Contributions
Provide a dataset for distinguishing Bel Canto and Chinese folk song singing tech |
joelniklaus/BrCAD-5 | ---
license: cc-by-nc-sa-4.0
---
# Dataset Card for BrCAD-5
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [GitHub](https://github.com/eliasjacob/paper_brcad5/)
- **Repository:** [Kaggle](https://www.kaggle.com/datasets/eliasjacob/brcad5)
- **Paper:** [PLOS ONE](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0272287)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@JoelNiklaus](https://github.com/JoelNiklaus) for adding this dataset.
|
irds/msmarco-document_trec-dl-hard_fold4 | ---
pretty_name: '`msmarco-document/trec-dl-hard/fold4`'
viewer: false
source_datasets: ['irds/msmarco-document']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-document/trec-dl-hard/fold4`
The `msmarco-document/trec-dl-hard/fold4` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-document#msmarco-document/trec-dl-hard/fold4).
# Data
This dataset provides:
- `queries` (i.e., topics); count=10
- `qrels`: (relevance assessments); count=1,054
- For `docs`, use [`irds/msmarco-document`](https://huggingface.co/datasets/irds/msmarco-document)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/msmarco-document_trec-dl-hard_fold4', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/msmarco-document_trec-dl-hard_fold4', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Mackie2021DlHard,
title={How Deep is your Learning: the DL-HARD Annotated Deep Learning Dataset},
author={Iain Mackie and Jeffrey Dalton and Andrew Yates},
journal={ArXiv},
year={2021},
volume={abs/2105.07975}
}
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj, Daniel Campos, Nick Craswell, Li Deng, Jianfeng Gao, Xiaodong Liu, Rangan Majumder, Andrew McNamara, Bhaskar Mitra, Tri Nguyen, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
Adignite/MSA-llama7b | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1651917
num_examples: 1000
download_size: 615476
dataset_size: 1651917
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anan-2024/twitter_dataset_1713214803 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 86863
num_examples: 247
download_size: 51268
dataset_size: 86863
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
venkatsrini/api_single_4k_truncateright | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: token_type_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence:
sequence: bool
- name: labels
sequence:
sequence: int32
splits:
- name: train
num_bytes: 5298664000
num_examples: 2450
download_size: 276339066
dataset_size: 5298664000
---
# Dataset Card for "api_single_4k_truncateright"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pytorch-survival/nwtco_pycox | ---
dataset_info:
features:
- name: stage
dtype: int64
- name: age
dtype: float32
- name: in.subcohort
dtype: float32
- name: instit_2
dtype: float32
- name: histol_2
dtype: float32
- name: study_4
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: float32
splits:
- name: train
num_bytes: 145008
num_examples: 4028
download_size: 40892
dataset_size: 145008
---
# Dataset Card for "nwtco_pycox"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamessyx/PathInstruct | ---
license: cc-by-nc-2.0
extra_gated_heading: "Access PathInstruct on Hugging Face"
extra_gated_prompt: "Requests will be processed in 1 business day."
extra_gated_fields:
Country: country
Affiliation: text
Specific date: date_picker
I want to use this dataset for:
type: select
options:
- Research
- Education
- label: Other
value: other
I agree to use this dataset for non-commercial use ONLY: checkbox
I agree to give appropriate cite for the source data: checkbox
---
This is the official Hugging Face repo for **PathInstruct** dataset.
## Citation
```
@article{sun2023pathasst,
title={Pathasst: Redefining pathology through generative foundation ai assistant for pathology},
author={Sun, Yuxuan and Zhu, Chenglu and Zheng, Sunyi and Zhang, Kai and Shui, Zhongyi and Yu, Xiaoxuan and Zhao, Yizhi and Li, Honglin and Zhang, Yunlong and Zhao, Ruojia and others},
journal={arXiv preprint arXiv:2305.15072},
year={2023}
}
```
|
bhavyagiri/imdb-spoiler | ---
license: apache-2.0
---
This is a subset of a [large dataset](https://www.kaggle.com/datasets/rmisra/imdb-spoiler-dataset) for classifying whether a movie review is a spoiler or not.
It was used to fine-tune the [roberta-base](https://huggingface.co/roberta-base) model for text classification. [Check it out!](https://huggingface.co/bhavyagiri/roberta-base-finetuned-imdb-spoilers) |
relhousieny/tokenized_lamini_gpt | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2051927
num_examples: 1400
download_size: 676522
dataset_size: 2051927
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ichijou_hotaru_nonnonbiyori | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ichijou Hotaru
This is the dataset of Ichijou Hotaru, containing 299 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 299 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 725 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 807 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 299 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 299 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 299 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 725 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 725 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 613 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 807 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 807 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Leonlav77/Lojc3 | ---
license: apache-2.0
---
|
bourbouh/moroccan-darija-youtube-subtitles | ---
annotations_creators:
- no-annotation
language_creators:
- machine-generated
language:
- ar
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Moroccan Darija YouTube Subtitles
size_categories:
- 0<n<300
source_datasets:
- original
task_categories:
- other
task_ids:
- language-modeling
---
# Moroccan Darija YouTube Subtitles Dataset
This dataset contains subtitles from YouTube videos in Moroccan Darija, a colloquial Arabic dialect spoken in Morocco. The subtitles were collected from several popular Moroccan YouTube channels, providing a diverse set of transcriptions in the Darija language.
## Dataset Description
The dataset is provided as a CSV file, where each row represents a YouTube video and contains the following columns:
- `video_id`: The unique identifier of the YouTube video.
- `title`: The title of the YouTube video.
- `transcript`: The transcript of the video in Moroccan Darija without timestamps.
The subtitles cover a wide range of topics, including entertainment, news, history, and more, offering a comprehensive representation of the Moroccan Darija language as used in YouTube content.
Original transcripts and srt file can be found [here](https://github.com/hbourbouh/bourbouh-moroccan-darija-youtube-subtitles).
## Dataset Structure
The dataset is a single CSV file named `moroccan_darija_subtitles.csv`. The CSV file has the following structure:
```
video_id,title,transcript
video1_id,Video 1 Title,Video 1 Transcript in Moroccan Darija
video2_id,Video 2 Title,Video 2 Transcript in Moroccan Darija
...
```
- The first row of the CSV file contains the column headers: `video_id`, `title`, and `transcript`.
- Each subsequent row represents a YouTube video and its corresponding subtitle information.
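The layout above can be read with nothing more than Python's standard library. A minimal sketch, using an inline sample that stands in for `moroccan_darija_subtitles.csv`:

```python
import csv
import io

# Inline sample mirroring the CSV layout described above; in practice,
# pass open("moroccan_darija_subtitles.csv", encoding="utf-8") instead.
sample = io.StringIO(
    "video_id,title,transcript\n"
    "video1_id,Video 1 Title,Video 1 Transcript in Moroccan Darija\n"
)

# DictReader uses the header row as field names for each record.
rows = list(csv.DictReader(sample))
print(rows[0]["video_id"])        # video1_id
print(sorted(rows[0].keys()))     # ['title', 'transcript', 'video_id']
```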
## License
This dataset is licensed under the [Creative Commons Attribution 4.0 International License](https://creativecommons.org/licenses/by/4.0/). By using this dataset, you agree to the terms and conditions of the license.
## Contact
Bouaghad, El Hassan \
Bourbouh, Hamza
If you have any questions, suggestions, or issues regarding this dataset, please contact us at hamza@misi.ma.
|
ibivibiv/alpaca_tiny13 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 459971748
num_examples: 290901
download_size: 266296353
dataset_size: 459971748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f17e5747 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1331
dataset_size: 178
---
# Dataset Card for "f17e5747"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aditya11997/ut-zap-50k | ---
license: mit
---
|
Gooogr/pie_idioms | ---
license: cc-by-4.0
dataset_info:
features:
- name: idiom
dtype: string
- name: is_pie
dtype: bool
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PIE
'2': I-PIE
splits:
- name: train
num_bytes: 82950018
num_examples: 46090
- name: validation
num_bytes: 10420303
num_examples: 5761
- name: test
num_bytes: 10376839
num_examples: 5762
download_size: 19258913
dataset_size: 103747160
task_categories:
- token-classification
language:
- en
tags:
- PIE
- idioms
size_categories:
- 10K<n<100K
pretty_name: Corpus of potentially idiomatic expressions (PIEs)
---
# Dataset Card for PIEs corpus
### Dataset Summary
This corpus is a collection of 57170 potentially idiomatic expressions (PIEs) based on the British National Corpus, prepared for the NER task.
Each instance comes with a contextual set of tokens, BIO tags, and a boolean label.
The data sources are:
* [MAGPIE corpus](https://github.com/hslh/magpie-corpus)
* [PIE corpus](https://github.com/hslh/pie-annotation)
Detailed data preparation pipeline can be found [here](https://github.com/Gooogr/Idioms_spotter)
### Supported Tasks and Leaderboards
Token classification (NER)
### Languages
English
## Dataset Structure
### Data Instances
For each instance there is a string with the target idiom, a word-tokenized text giving the context of the idiom's usage, the corresponding BIO tags,
and a boolean label `is_pie`. This tag determines whether or not a collocation is considered an idiom in the given context.
For the PIE dataset the choice was determined by the original PIE_label. For MAGPIE, a threshold of 0.75 on the confidence coefficient was chosen.
An example from the train set looks like the following:
```
{'idiom': "go public"
'is_pie': True
'tokens': [ "Private", "dealers", "in", "the", "States", "go", "public" ]
'ner_tags': [ 0, 0, 0, 0, 0, 1, 2 ]
}
```
Where the NER tag mapping is {0: 'O', 1: 'B-PIE', 2: 'I-PIE'}
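Decoding the integer tags back to their BIO labels is a direct lookup; a minimal sketch using the example instance above:

```python
# BIO label mapping as stated in this card.
id2label = {0: "O", 1: "B-PIE", 2: "I-PIE"}

# Example instance copied from the card.
example = {
    "idiom": "go public",
    "is_pie": True,
    "tokens": ["Private", "dealers", "in", "the", "States", "go", "public"],
    "ner_tags": [0, 0, 0, 0, 0, 1, 2],
}

# Pair each token with its decoded BIO label.
decoded = [id2label[tag] for tag in example["ner_tags"]]
print(list(zip(example["tokens"], decoded)))
# ... ('go', 'B-PIE'), ('public', 'I-PIE')]
```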
### Data Fields
* idiom: a string containing the original PIE
* is_pie: a boolean label determining whether the PIE can be considered an idiom in the given context
* tokens: a word-tokenized sequence giving the PIE usage context
* ner_tags: the corresponding BIO tags for the word tokens
### Data Splits
The dataset has 3 splits: _train_, _validation_, and _test_.
| Dataset Split | Number of Instances in Split |
| ------------- |----------------------------- |
| Train | 45,736 |
| Validation | 5,717 |
| Test | 5,717 |
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
* [MAGPIE corpus](https://github.com/hslh/magpie-corpus)
* [PIE English corpus](https://github.com/hslh/pie-annotation)
## Additional Information
### Licensing Information
Corpus and it's sources are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
### Citation Information
[PIE Corpus](https://github.com/hslh/pie-annotation) (Haagsma, H. (Creator), Bos, J. (Contributor), Plank, B. (Contributor), University of Groningen.)<br>
[MAGPIE: A Large Corpus of Potentially Idiomatic Expressions](https://aclanthology.org/2020.lrec-1.35) (Haagsma et al., LREC 2020) |
micsell/hebrew_kan_sentence40000 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: language
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1881163759.0
num_examples: 10000
download_size: 1880326655
dataset_size: 1881163759.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
2A2I-R/DIBT-Arabic-Dataset_150s | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 87217
num_examples: 150
download_size: 46397
dataset_size: 87217
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pythera/english-mlmcorpus | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 93832106139.0
num_examples: 90584920
download_size: 58728372904
dataset_size: 93832106139.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "english-mlmcorpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-6da44258-8968-4823-8933-3375e1cfee89-64 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
zolak/twitter_dataset_79_1713034039 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6838643
num_examples: 16987
download_size: 3433923
dataset_size: 6838643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/lmind_nq_train5000_eval5000_v1_recite_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 581636
num_examples: 5000
- name: train_recite_qa
num_bytes: 3790343
num_examples: 5000
- name: eval_qa
num_bytes: 580393
num_examples: 5000
- name: eval_recite_qa
num_bytes: 3785337
num_examples: 5000
- name: all_docs
num_bytes: 5846467
num_examples: 8964
- name: all_docs_eval
num_bytes: 5845967
num_examples: 8964
- name: train
num_bytes: 9636810
num_examples: 13964
- name: validation
num_bytes: 3785337
num_examples: 5000
download_size: 21016479
dataset_size: 33852290
---
# Dataset Card for "lmind_nq_train5000_eval5000_v1_recite_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/704dc3cf | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1340
dataset_size: 182
---
# Dataset Card for "704dc3cf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chenrm/koikatsu-cards | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 43368873054.078
num_examples: 10178
- name: test
num_bytes: 20733059.0
num_examples: 5
download_size: 56731523062
dataset_size: 43389606113.078
---
# Dataset Card for "koikatsu-cards"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/wikiart-resized-sample | ---
dataset_info:
features:
- name: image
dtype: image
- name: artist
dtype:
class_label:
names:
'0': Unknown Artist
'1': boris-kustodiev
'2': camille-pissarro
'3': childe-hassam
'4': claude-monet
'5': edgar-degas
'6': eugene-boudin
'7': gustave-dore
'8': ilya-repin
'9': ivan-aivazovsky
'10': ivan-shishkin
'11': john-singer-sargent
'12': marc-chagall
'13': martiros-saryan
'14': nicholas-roerich
'15': pablo-picasso
'16': paul-cezanne
'17': pierre-auguste-renoir
'18': pyotr-konchalovsky
'19': raphael-kirchner
'20': rembrandt
'21': salvador-dali
'22': vincent-van-gogh
'23': hieronymus-bosch
'24': leonardo-da-vinci
'25': albrecht-durer
'26': edouard-cortes
'27': sam-francis
'28': juan-gris
'29': lucas-cranach-the-elder
'30': paul-gauguin
'31': konstantin-makovsky
'32': egon-schiele
'33': thomas-eakins
'34': gustave-moreau
'35': francisco-goya
'36': edvard-munch
'37': henri-matisse
'38': fra-angelico
'39': maxime-maufra
'40': jan-matejko
'41': mstislav-dobuzhinsky
'42': alfred-sisley
'43': mary-cassatt
'44': gustave-loiseau
'45': fernando-botero
'46': zinaida-serebriakova
'47': georges-seurat
'48': isaac-levitan
'49': joaquãn-sorolla
'50': jacek-malczewski
'51': berthe-morisot
'52': andy-warhol
'53': arkhip-kuindzhi
'54': niko-pirosmani
'55': james-tissot
'56': vasily-polenov
'57': valentin-serov
'58': pietro-perugino
'59': pierre-bonnard
'60': ferdinand-hodler
'61': bartolome-esteban-murillo
'62': giovanni-boldini
'63': henri-martin
'64': gustav-klimt
'65': vasily-perov
'66': odilon-redon
'67': tintoretto
'68': gene-davis
'69': raphael
'70': john-henry-twachtman
'71': henri-de-toulouse-lautrec
'72': antoine-blanchard
'73': david-burliuk
'74': camille-corot
'75': konstantin-korovin
'76': ivan-bilibin
'77': titian
'78': maurice-prendergast
'79': edouard-manet
'80': peter-paul-rubens
'81': aubrey-beardsley
'82': paolo-veronese
'83': joshua-reynolds
'84': kuzma-petrov-vodkin
'85': gustave-caillebotte
'86': lucian-freud
'87': michelangelo
'88': dante-gabriel-rossetti
'89': felix-vallotton
'90': nikolay-bogdanov-belsky
'91': georges-braque
'92': vasily-surikov
'93': fernand-leger
'94': konstantin-somov
'95': katsushika-hokusai
'96': sir-lawrence-alma-tadema
'97': vasily-vereshchagin
'98': ernst-ludwig-kirchner
'99': mikhail-vrubel
'100': orest-kiprensky
'101': william-merritt-chase
'102': aleksey-savrasov
'103': hans-memling
'104': amedeo-modigliani
'105': ivan-kramskoy
'106': utagawa-kuniyoshi
'107': gustave-courbet
'108': william-turner
'109': theo-van-rysselberghe
'110': joseph-wright
'111': edward-burne-jones
'112': koloman-moser
'113': viktor-vasnetsov
'114': anthony-van-dyck
'115': raoul-dufy
'116': frans-hals
'117': hans-holbein-the-younger
'118': ilya-mashkov
'119': henri-fantin-latour
'120': m.c.-escher
'121': el-greco
'122': mikalojus-ciurlionis
'123': james-mcneill-whistler
'124': karl-bryullov
'125': jacob-jordaens
'126': thomas-gainsborough
'127': eugene-delacroix
'128': canaletto
- name: genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
'10': Unknown Genre
- name: style
dtype:
class_label:
names:
'0': Abstract_Expressionism
'1': Action_painting
'2': Analytical_Cubism
'3': Art_Nouveau
'4': Baroque
'5': Color_Field_Painting
'6': Contemporary_Realism
'7': Cubism
'8': Early_Renaissance
'9': Expressionism
'10': Fauvism
'11': High_Renaissance
'12': Impressionism
'13': Mannerism_Late_Renaissance
'14': Minimalism
'15': Naive_Art_Primitivism
'16': New_Realism
'17': Northern_Renaissance
'18': Pointillism
'19': Pop_Art
'20': Post_Impressionism
'21': Realism
'22': Rococo
'23': Romanticism
'24': Symbolism
'25': Synthetic_Cubism
'26': Ukiyo_e
splits:
- name: train
num_bytes: 3110660852.85595
num_examples: 50000
download_size: 3114376026
dataset_size: 3110660852.85595
---
# Dataset Card for "wikiart-resized-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hails/bigbench | ---
dataset_info:
- config_name: abstract_narrative_understanding_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 6560069
num_examples: 3000
- name: train
num_bytes: 5249819
num_examples: 2400
- name: validation
num_bytes: 1310250
num_examples: 600
download_size: 0
dataset_size: 13120138
- config_name: anachronisms_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 48826
num_examples: 230
- name: train
num_bytes: 39116
num_examples: 184
- name: validation
num_bytes: 9710
num_examples: 46
download_size: 0
dataset_size: 97652
- config_name: analogical_similarity_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1373815
num_examples: 323
- name: train
num_bytes: 1101512
num_examples: 259
- name: validation
num_bytes: 272303
num_examples: 64
download_size: 0
dataset_size: 2747630
- config_name: analytic_entailment_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 17316
num_examples: 70
- name: train
num_bytes: 13368
num_examples: 54
- name: validation
num_bytes: 3948
num_examples: 16
download_size: 0
dataset_size: 34632
- config_name: arithmetic_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 3833272
num_examples: 15023
- name: train
num_bytes: 3066775
num_examples: 12019
- name: validation
num_bytes: 766497
num_examples: 3004
download_size: 0
dataset_size: 7666544
- config_name: ascii_word_recognition_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 4984662
num_examples: 5000
- name: train
num_bytes: 3997273
num_examples: 4000
- name: validation
num_bytes: 987389
num_examples: 1000
download_size: 0
dataset_size: 9969324
- config_name: authorship_verification_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 14118592
num_examples: 880
- name: train
num_bytes: 11288481
num_examples: 704
- name: validation
num_bytes: 2830111
num_examples: 176
download_size: 0
dataset_size: 28237184
- config_name: auto_categorization_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 40549
num_examples: 328
- name: train
num_bytes: 32992
num_examples: 263
- name: validation
num_bytes: 7557
num_examples: 65
download_size: 0
dataset_size: 81098
- config_name: auto_debugging_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 5112
num_examples: 34
- name: train
num_bytes: 2651
num_examples: 18
- name: validation
num_bytes: 2461
num_examples: 16
download_size: 0
dataset_size: 10224
- config_name: bbq_lite_json_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 6890493
num_examples: 16076
- name: train
num_bytes: 5508584
num_examples: 12866
- name: validation
num_bytes: 1381909
num_examples: 3210
download_size: 0
dataset_size: 13780986
- config_name: bridging_anaphora_resolution_barqa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1971015
num_examples: 648
- name: train
num_bytes: 1537264
num_examples: 519
- name: validation
num_bytes: 433751
num_examples: 129
download_size: 0
dataset_size: 3942030
- config_name: causal_judgment_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 204878
num_examples: 190
- name: train
num_bytes: 164940
num_examples: 152
- name: validation
num_bytes: 39938
num_examples: 38
download_size: 0
dataset_size: 409756
- config_name: cause_and_effect_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 49314
num_examples: 153
- name: train
num_bytes: 39620
num_examples: 123
- name: validation
num_bytes: 9694
num_examples: 30
download_size: 0
dataset_size: 98628
- config_name: checkmate_in_one_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 3123256
num_examples: 3498
- name: train
num_bytes: 2502314
num_examples: 2799
- name: validation
num_bytes: 620942
num_examples: 699
download_size: 0
dataset_size: 6246512
- config_name: chess_state_tracking_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 3269932
num_examples: 6000
- name: train
num_bytes: 2616294
num_examples: 4800
- name: validation
num_bytes: 653638
num_examples: 1200
download_size: 0
dataset_size: 6539864
- config_name: chinese_remainder_theorem_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 153222
num_examples: 500
- name: train
num_bytes: 122601
num_examples: 400
- name: validation
num_bytes: 30621
num_examples: 100
download_size: 0
dataset_size: 306444
- config_name: cifar10_classification_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 111022200
num_examples: 20000
- name: train
num_bytes: 88782724
num_examples: 16000
- name: validation
num_bytes: 22239476
num_examples: 4000
download_size: 0
dataset_size: 222044400
- config_name: code_line_description_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 33670
num_examples: 60
- name: train
num_bytes: 25530
num_examples: 44
- name: validation
num_bytes: 8140
num_examples: 16
download_size: 0
dataset_size: 67340
- config_name: codenames_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 25195
num_examples: 85
- name: train
num_bytes: 19964
num_examples: 68
- name: validation
num_bytes: 5231
num_examples: 17
download_size: 0
dataset_size: 50390
- config_name: color_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1633263
num_examples: 4000
- name: train
num_bytes: 1306663
num_examples: 3200
- name: validation
num_bytes: 326600
num_examples: 800
download_size: 0
dataset_size: 3266526
- config_name: common_morpheme_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 12388
num_examples: 50
- name: train
num_bytes: 8444
num_examples: 34
- name: validation
num_bytes: 3944
num_examples: 16
download_size: 0
dataset_size: 24776
- config_name: conceptual_combinations_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 58859
num_examples: 103
- name: train
num_bytes: 48010
num_examples: 84
- name: validation
num_bytes: 10849
num_examples: 19
download_size: 0
dataset_size: 117718
- config_name: conlang_translation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 215190
num_examples: 164
- name: train
num_bytes: 173024
num_examples: 132
- name: validation
num_bytes: 42166
num_examples: 32
download_size: 0
dataset_size: 430380
- config_name: contextual_parametric_knowledge_conflicts_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 14587554
num_examples: 17528
- name: train
num_bytes: 11666236
num_examples: 14023
- name: validation
num_bytes: 2921318
num_examples: 3505
download_size: 0
dataset_size: 29175108
- config_name: crash_blossom_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 12194
num_examples: 38
- name: train
num_bytes: 6999
num_examples: 22
- name: validation
num_bytes: 5195
num_examples: 16
download_size: 0
dataset_size: 24388
- config_name: crass_ai_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 22870
num_examples: 44
- name: train
num_bytes: 14130
num_examples: 28
- name: validation
num_bytes: 8740
num_examples: 16
download_size: 0
dataset_size: 45740
- config_name: cryobiology_spanish_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 38674
num_examples: 146
- name: train
num_bytes: 31129
num_examples: 117
- name: validation
num_bytes: 7545
num_examples: 29
download_size: 0
dataset_size: 77348
- config_name: cryptonite_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 2844402
num_examples: 26157
- name: train
num_bytes: 2275724
num_examples: 20926
- name: validation
num_bytes: 568678
num_examples: 5231
download_size: 0
dataset_size: 5688804
- config_name: cs_algorithms_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 272435
num_examples: 1320
- name: train
num_bytes: 218192
num_examples: 1056
- name: validation
num_bytes: 54243
num_examples: 264
download_size: 0
dataset_size: 544870
- config_name: dark_humor_detection_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 26556
num_examples: 80
- name: train
num_bytes: 21267
num_examples: 64
- name: validation
num_bytes: 5289
num_examples: 16
download_size: 0
dataset_size: 53112
- config_name: date_understanding_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 94908
num_examples: 369
- name: train
num_bytes: 76165
num_examples: 296
- name: validation
num_bytes: 18743
num_examples: 73
download_size: 0
dataset_size: 189816
- config_name: disambiguation_qa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 122471
num_examples: 258
- name: train
num_bytes: 98687
num_examples: 207
- name: validation
num_bytes: 23784
num_examples: 51
download_size: 0
dataset_size: 244942
- config_name: discourse_marker_prediction_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 2090684
num_examples: 857
- name: train
num_bytes: 1666052
num_examples: 686
- name: validation
num_bytes: 424632
num_examples: 171
download_size: 0
dataset_size: 4181368
- config_name: disfl_qa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 7964775
num_examples: 8000
- name: train
num_bytes: 6376511
num_examples: 6400
- name: validation
num_bytes: 1588264
num_examples: 1600
download_size: 0
dataset_size: 15929550
- config_name: dyck_languages_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1227916
num_examples: 1000
- name: train
num_bytes: 982680
num_examples: 800
- name: validation
num_bytes: 245236
num_examples: 200
download_size: 0
dataset_size: 2455832
- config_name: elementary_math_qa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 13442550
num_examples: 38160
- name: train
num_bytes: 10766969
num_examples: 30531
- name: validation
num_bytes: 2675581
num_examples: 7629
download_size: 0
dataset_size: 26885100
- config_name: emoji_movie_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 33667
num_examples: 100
- name: train
num_bytes: 26987
num_examples: 80
- name: validation
num_bytes: 6680
num_examples: 20
download_size: 0
dataset_size: 67334
- config_name: emojis_emotion_prediction_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 47983
num_examples: 131
- name: train
num_bytes: 38458
num_examples: 105
- name: validation
num_bytes: 9525
num_examples: 26
download_size: 0
dataset_size: 95966
- config_name: empirical_judgments_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 47499
num_examples: 99
- name: train
num_bytes: 38346
num_examples: 80
- name: validation
num_bytes: 9153
num_examples: 19
download_size: 0
dataset_size: 94998
- config_name: english_proverbs_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 22530
num_examples: 34
- name: train
num_bytes: 12066
num_examples: 18
- name: validation
num_bytes: 10464
num_examples: 16
download_size: 0
dataset_size: 45060
- config_name: english_russian_proverbs_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 59900
num_examples: 80
- name: train
num_bytes: 48051
num_examples: 64
- name: validation
num_bytes: 11849
num_examples: 16
download_size: 0
dataset_size: 119800
- config_name: entailed_polarity_hindi_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 57052
num_examples: 138
- name: train
num_bytes: 45829
num_examples: 111
- name: validation
num_bytes: 11223
num_examples: 27
download_size: 0
dataset_size: 114104
- config_name: entailed_polarity_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 25421
num_examples: 148
- name: train
num_bytes: 20350
num_examples: 119
- name: validation
num_bytes: 5071
num_examples: 29
download_size: 0
dataset_size: 50842
- config_name: epistemic_reasoning_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 887158
num_examples: 2000
- name: train
num_bytes: 710107
num_examples: 1600
- name: validation
num_bytes: 177051
num_examples: 400
download_size: 0
dataset_size: 1774316
- config_name: evaluating_information_essentiality_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 77488
num_examples: 68
- name: train
num_bytes: 59596
num_examples: 52
- name: validation
num_bytes: 17892
num_examples: 16
download_size: 0
dataset_size: 154976
- config_name: fact_checker_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1337384
num_examples: 7154
- name: train
num_bytes: 1070750
num_examples: 5724
- name: validation
num_bytes: 266634
num_examples: 1430
download_size: 0
dataset_size: 2674768
- config_name: fantasy_reasoning_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 75886
num_examples: 201
- name: train
num_bytes: 61398
num_examples: 161
- name: validation
num_bytes: 14488
num_examples: 40
download_size: 0
dataset_size: 151772
- config_name: few_shot_nlg_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 75937
num_examples: 153
- name: train
num_bytes: 61862
num_examples: 123
- name: validation
num_bytes: 14075
num_examples: 30
download_size: 0
dataset_size: 151874
- config_name: figure_of_speech_detection_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 21717
num_examples: 59
- name: train
num_bytes: 15962
num_examples: 43
- name: validation
num_bytes: 5755
num_examples: 16
download_size: 0
dataset_size: 43434
- config_name: formal_fallacies_syllogisms_negation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 8314653
num_examples: 14200
- name: train
num_bytes: 6652955
num_examples: 11360
- name: validation
num_bytes: 1661698
num_examples: 2840
download_size: 0
dataset_size: 16629306
- config_name: gem_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 36065281
num_examples: 14802
- name: train
num_bytes: 28819497
num_examples: 11845
- name: validation
num_bytes: 7245784
num_examples: 2957
download_size: 0
dataset_size: 72130562
- config_name: gender_inclusive_sentences_german_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 126881
num_examples: 200
- name: train
num_bytes: 100628
num_examples: 160
- name: validation
num_bytes: 26253
num_examples: 40
download_size: 0
dataset_size: 253762
- config_name: general_knowledge_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 21828
num_examples: 70
- name: train
num_bytes: 16818
num_examples: 54
- name: validation
num_bytes: 5010
num_examples: 16
download_size: 0
dataset_size: 43656
- config_name: geometric_shapes_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 180094
num_examples: 359
- name: train
num_bytes: 144602
num_examples: 288
- name: validation
num_bytes: 35492
num_examples: 71
download_size: 0
dataset_size: 360188
- config_name: goal_step_wikihow_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 3567615
num_examples: 7053
- name: train
num_bytes: 2853871
num_examples: 5643
- name: validation
num_bytes: 713744
num_examples: 1410
download_size: 0
dataset_size: 7135230
- config_name: gre_reading_comprehension_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 94273
num_examples: 31
- name: train
num_bytes: 44458
num_examples: 15
- name: validation
num_bytes: 49815
num_examples: 16
download_size: 0
dataset_size: 188546
- config_name: hhh_alignment_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 272898
num_examples: 221
- name: train
num_bytes: 212488
num_examples: 179
- name: validation
num_bytes: 60410
num_examples: 42
download_size: 0
dataset_size: 545796
- config_name: hindi_question_answering_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 15154954
num_examples: 6610
- name: train
num_bytes: 11983837
num_examples: 5288
- name: validation
num_bytes: 3171117
num_examples: 1322
download_size: 0
dataset_size: 30309908
- config_name: hindu_knowledge_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 44092
num_examples: 175
- name: train
num_bytes: 35392
num_examples: 140
- name: validation
num_bytes: 8700
num_examples: 35
download_size: 0
dataset_size: 88184
- config_name: hinglish_toxicity_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 60613
num_examples: 200
- name: train
num_bytes: 49997
num_examples: 160
- name: validation
num_bytes: 10616
num_examples: 40
download_size: 0
dataset_size: 121226
- config_name: human_organs_senses_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 7944
num_examples: 42
- name: train
num_bytes: 4873
num_examples: 26
- name: validation
num_bytes: 3071
num_examples: 16
download_size: 0
dataset_size: 15888
- config_name: hyperbaton_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 9383986
num_examples: 50000
- name: train
num_bytes: 7509334
num_examples: 40000
- name: validation
num_bytes: 1874652
num_examples: 10000
download_size: 0
dataset_size: 18767972
- config_name: identify_math_theorems_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 104841
num_examples: 53
- name: train
num_bytes: 70295
num_examples: 37
- name: validation
num_bytes: 34546
num_examples: 16
download_size: 0
dataset_size: 209682
- config_name: identify_odd_metaphor_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 27602
num_examples: 47
- name: train
num_bytes: 18138
num_examples: 31
- name: validation
num_bytes: 9464
num_examples: 16
download_size: 0
dataset_size: 55204
- config_name: implicatures_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 91683
num_examples: 492
- name: train
num_bytes: 73416
num_examples: 394
- name: validation
num_bytes: 18267
num_examples: 98
download_size: 0
dataset_size: 183366
- config_name: implicit_relations_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 79710
num_examples: 85
- name: train
num_bytes: 64346
num_examples: 68
- name: validation
num_bytes: 15364
num_examples: 17
download_size: 0
dataset_size: 159420
- config_name: intent_recognition_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 322371
num_examples: 693
- name: train
num_bytes: 257864
num_examples: 555
- name: validation
num_bytes: 64507
num_examples: 138
download_size: 0
dataset_size: 644742
- config_name: international_phonetic_alphabet_nli_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 79320
num_examples: 126
- name: train
num_bytes: 63288
num_examples: 101
- name: validation
num_bytes: 16032
num_examples: 25
download_size: 0
dataset_size: 158640
- config_name: international_phonetic_alphabet_transliterate_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 275938
num_examples: 1003
- name: train
num_bytes: 220784
num_examples: 803
- name: validation
num_bytes: 55154
num_examples: 200
download_size: 0
dataset_size: 551876
- config_name: intersect_geometry_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 211674752
num_examples: 249999
- name: train
num_bytes: 169332898
num_examples: 200000
- name: validation
num_bytes: 42341854
num_examples: 49999
download_size: 0
dataset_size: 423349504
- config_name: irony_identification_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 28178
num_examples: 99
- name: train
num_bytes: 22918
num_examples: 80
- name: validation
num_bytes: 5260
num_examples: 19
download_size: 0
dataset_size: 56356
- config_name: kanji_ascii_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 366946
num_examples: 1092
- name: train
num_bytes: 293933
num_examples: 875
- name: validation
num_bytes: 73013
num_examples: 217
download_size: 0
dataset_size: 733892
- config_name: kannada_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 140638
num_examples: 316
- name: train
num_bytes: 111865
num_examples: 253
- name: validation
num_bytes: 28773
num_examples: 63
download_size: 0
dataset_size: 281276
- config_name: key_value_maps_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 105136
num_examples: 101
- name: train
num_bytes: 84317
num_examples: 80
- name: validation
num_bytes: 20819
num_examples: 21
download_size: 0
dataset_size: 210272
- config_name: known_unknowns_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 7960
num_examples: 46
- name: train
num_bytes: 5130
num_examples: 30
- name: validation
num_bytes: 2830
num_examples: 16
download_size: 0
dataset_size: 15920
- config_name: language_games_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 979619
num_examples: 2128
- name: train
num_bytes: 783111
num_examples: 1704
- name: validation
num_bytes: 196508
num_examples: 424
download_size: 0
dataset_size: 1959238
- config_name: language_identification_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 7376223
num_examples: 10000
- name: train
num_bytes: 5908808
num_examples: 8000
- name: validation
num_bytes: 1467415
num_examples: 2000
download_size: 0
dataset_size: 14752446
- config_name: linguistic_mappings_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1325186
num_examples: 15527
- name: train
num_bytes: 1060088
num_examples: 12426
- name: validation
num_bytes: 265098
num_examples: 3101
download_size: 0
dataset_size: 2650372
- config_name: linguistics_puzzles_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1746024
num_examples: 2000
- name: train
num_bytes: 1398113
num_examples: 1600
- name: validation
num_bytes: 347911
num_examples: 400
download_size: 0
dataset_size: 3492048
- config_name: list_functions_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 2678136
num_examples: 10750
- name: train
num_bytes: 2161065
num_examples: 8700
- name: validation
num_bytes: 517071
num_examples: 2050
download_size: 0
dataset_size: 5356272
- config_name: logic_grid_puzzle_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1456218
num_examples: 1000
- name: train
num_bytes: 1160137
num_examples: 800
- name: validation
num_bytes: 296081
num_examples: 200
download_size: 0
dataset_size: 2912436
- config_name: logical_args_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 43582
num_examples: 32
- name: train
num_bytes: 21072
num_examples: 16
- name: validation
num_bytes: 22510
num_examples: 16
download_size: 0
dataset_size: 87164
- config_name: logical_deduction_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1056716
num_examples: 1500
- name: train
num_bytes: 841788
num_examples: 1200
- name: validation
num_bytes: 214928
num_examples: 300
download_size: 0
dataset_size: 2113432
- config_name: logical_fallacy_detection_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 720286
num_examples: 2800
- name: train
num_bytes: 576295
num_examples: 2240
- name: validation
num_bytes: 143991
num_examples: 560
download_size: 0
dataset_size: 1440572
- config_name: logical_sequence_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 22722
num_examples: 39
- name: train
num_bytes: 12648
num_examples: 23
- name: validation
num_bytes: 10074
num_examples: 16
download_size: 0
dataset_size: 45444
- config_name: mathematical_induction_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 19018
num_examples: 69
- name: train
num_bytes: 14983
num_examples: 53
- name: validation
num_bytes: 4035
num_examples: 16
download_size: 0
dataset_size: 38036
- config_name: matrixshapes_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1130574
num_examples: 4462
- name: train
num_bytes: 906061
num_examples: 3570
- name: validation
num_bytes: 224513
num_examples: 892
download_size: 0
dataset_size: 2261148
- config_name: metaphor_boolean_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 213848
num_examples: 680
- name: train
num_bytes: 170765
num_examples: 544
- name: validation
num_bytes: 43083
num_examples: 136
download_size: 0
dataset_size: 427696
- config_name: metaphor_understanding_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 200862
num_examples: 234
- name: train
num_bytes: 162101
num_examples: 188
- name: validation
num_bytes: 38761
num_examples: 46
download_size: 0
dataset_size: 401724
- config_name: minute_mysteries_qa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 3245190
num_examples: 477
- name: train
num_bytes: 2623703
num_examples: 383
- name: validation
num_bytes: 621487
num_examples: 94
download_size: 0
dataset_size: 6490380
- config_name: misconceptions_russian_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 16991
num_examples: 49
- name: train
num_bytes: 10970
num_examples: 33
- name: validation
num_bytes: 6021
num_examples: 16
download_size: 0
dataset_size: 33982
- config_name: misconceptions_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 45816
num_examples: 219
- name: train
num_bytes: 37246
num_examples: 176
- name: validation
num_bytes: 8570
num_examples: 43
download_size: 0
dataset_size: 91632
- config_name: mnist_ascii_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 61739808
num_examples: 69984
- name: train
num_bytes: 49419928
num_examples: 55988
- name: validation
num_bytes: 12319880
num_examples: 13996
download_size: 0
dataset_size: 123479616
- config_name: modified_arithmetic_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1220993
num_examples: 6000
- name: train
num_bytes: 976859
num_examples: 4800
- name: validation
num_bytes: 244134
num_examples: 1200
download_size: 0
dataset_size: 2441986
- config_name: moral_permissibility_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 162068
num_examples: 342
- name: train
num_bytes: 128790
num_examples: 274
- name: validation
num_bytes: 33278
num_examples: 68
download_size: 0
dataset_size: 324136
- config_name: movie_dialog_same_or_different_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 28645997
num_examples: 50000
- name: train
num_bytes: 22889061
num_examples: 40000
- name: validation
num_bytes: 5756936
num_examples: 10000
download_size: 0
dataset_size: 57291994
- config_name: movie_recommendation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 173557
num_examples: 500
- name: train
num_bytes: 138936
num_examples: 400
- name: validation
num_bytes: 34621
num_examples: 100
download_size: 0
dataset_size: 347114
- config_name: mult_data_wrangling_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 625422
num_examples: 7854
- name: train
num_bytes: 507838
num_examples: 6380
- name: validation
num_bytes: 117584
num_examples: 1474
download_size: 0
dataset_size: 1250844
- config_name: multiemo_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 650173925
num_examples: 1437281
- name: train
num_bytes: 520172185
num_examples: 1149873
- name: validation
num_bytes: 130001740
num_examples: 287408
download_size: 0
dataset_size: 1300347850
- config_name: natural_instructions_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 355938370
num_examples: 193250
- name: train
num_bytes: 284920096
num_examples: 154615
- name: validation
num_bytes: 71018274
num_examples: 38635
download_size: 0
dataset_size: 711876740
- config_name: navigate_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 225813
num_examples: 1000
- name: train
num_bytes: 180958
num_examples: 800
- name: validation
num_bytes: 44855
num_examples: 200
download_size: 83744
dataset_size: 451626
- config_name: nonsense_words_grammar_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 11102
num_examples: 50
- name: train
num_bytes: 7582
num_examples: 34
- name: validation
num_bytes: 3520
num_examples: 16
download_size: 24107
dataset_size: 22204
- config_name: novel_concepts_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 16065
num_examples: 32
- name: train
num_bytes: 8128
num_examples: 16
- name: validation
num_bytes: 7937
num_examples: 16
download_size: 25919
dataset_size: 32130
- config_name: object_counting_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 149555
num_examples: 1000
- name: train
num_bytes: 119609
num_examples: 800
- name: validation
num_bytes: 29946
num_examples: 200
download_size: 91852
dataset_size: 299110
- config_name: odd_one_out_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 13843
num_examples: 86
- name: train
num_bytes: 11217
num_examples: 69
- name: validation
num_bytes: 2626
num_examples: 17
download_size: 25796
dataset_size: 27686
- config_name: operators_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 32435
num_examples: 210
- name: train
num_bytes: 25937
num_examples: 168
- name: validation
num_bytes: 6498
num_examples: 42
download_size: 24728
dataset_size: 64870
- config_name: paragraph_segmentation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 56846507
num_examples: 9000
- name: train
num_bytes: 45674320
num_examples: 7200
- name: validation
num_bytes: 11172187
num_examples: 1800
download_size: 61123049
dataset_size: 113693014
- config_name: parsinlu_qa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 456189
num_examples: 1050
- name: train
num_bytes: 366577
num_examples: 840
- name: validation
num_bytes: 89612
num_examples: 210
download_size: 465963
dataset_size: 912378
- config_name: parsinlu_reading_comprehension_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 573798
num_examples: 518
- name: train
num_bytes: 455828
num_examples: 415
- name: validation
num_bytes: 117970
num_examples: 103
download_size: 572992
dataset_size: 1147596
- config_name: penguins_in_a_table_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 75985
num_examples: 149
- name: train
num_bytes: 61321
num_examples: 120
- name: validation
num_bytes: 14664
num_examples: 29
download_size: 32039
dataset_size: 151970
- config_name: periodic_elements_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 285204
num_examples: 654
- name: train
num_bytes: 229481
num_examples: 524
- name: validation
num_bytes: 55723
num_examples: 130
download_size: 41084
dataset_size: 570408
- config_name: persian_idioms_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 28592
num_examples: 66
- name: train
num_bytes: 21684
num_examples: 50
- name: validation
num_bytes: 6908
num_examples: 16
download_size: 34341
dataset_size: 57184
- config_name: phrase_relatedness_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 30190
num_examples: 100
- name: train
num_bytes: 23773
num_examples: 80
- name: validation
num_bytes: 6417
num_examples: 20
download_size: 40334
dataset_size: 60380
- config_name: physical_intuition_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 23734
num_examples: 81
- name: train
num_bytes: 19307
num_examples: 65
- name: validation
num_bytes: 4427
num_examples: 16
download_size: 28462
dataset_size: 47468
- config_name: physics_questions_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 18372
num_examples: 54
- name: train
num_bytes: 13402
num_examples: 38
- name: validation
num_bytes: 4970
num_examples: 16
download_size: 35187
dataset_size: 36744
- config_name: physics_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 120239
num_examples: 229
- name: train
num_bytes: 96122
num_examples: 184
- name: validation
num_bytes: 24117
num_examples: 45
download_size: 69885
dataset_size: 240478
- config_name: play_dialog_same_or_different_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 3142468
num_examples: 3264
- name: train
num_bytes: 2516052
num_examples: 2612
- name: validation
num_bytes: 626416
num_examples: 652
download_size: 1710264
dataset_size: 6284936
- config_name: polish_sequence_labeling_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 18081112
num_examples: 12812
- name: train
num_bytes: 14470720
num_examples: 10250
- name: validation
num_bytes: 3610392
num_examples: 2562
download_size: 5242934
dataset_size: 36162224
- config_name: presuppositions_as_nli_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 502522
num_examples: 735
- name: train
num_bytes: 400761
num_examples: 588
- name: validation
num_bytes: 101761
num_examples: 147
download_size: 240065
dataset_size: 1005044
- config_name: qa_wikidata_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1419042
num_examples: 20321
- name: train
num_bytes: 1134918
num_examples: 16257
- name: validation
num_bytes: 284124
num_examples: 4064
download_size: 1181835
dataset_size: 2838084
- config_name: question_selection_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 2487181
num_examples: 1582
- name: train
num_bytes: 1990094
num_examples: 1266
- name: validation
num_bytes: 497087
num_examples: 316
download_size: 1804283
dataset_size: 4974362
- config_name: real_or_fake_text_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 53663318
num_examples: 15088
- name: train
num_bytes: 42879846
num_examples: 12072
- name: validation
num_bytes: 10783472
num_examples: 3016
download_size: 47399045
dataset_size: 107326636
- config_name: reasoning_about_colored_objects_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 907474
num_examples: 2000
- name: train
num_bytes: 729609
num_examples: 1600
- name: validation
num_bytes: 177865
num_examples: 400
download_size: 273263
dataset_size: 1814948
- config_name: repeat_copy_logic_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 6678
num_examples: 32
- name: train
num_bytes: 3327
num_examples: 16
- name: validation
num_bytes: 3351
num_examples: 16
download_size: 18315
dataset_size: 13356
- config_name: rephrase_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 34222
num_examples: 78
- name: train
num_bytes: 27360
num_examples: 62
- name: validation
num_bytes: 6862
num_examples: 16
download_size: 41102
dataset_size: 68444
- config_name: riddle_sense_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 15507
num_examples: 49
- name: train
num_bytes: 10741
num_examples: 33
- name: validation
num_bytes: 4766
num_examples: 16
download_size: 32496
dataset_size: 31014
- config_name: ruin_names_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 144087
num_examples: 448
- name: train
num_bytes: 115171
num_examples: 359
- name: validation
num_bytes: 28916
num_examples: 89
download_size: 118193
dataset_size: 288174
- config_name: salient_translation_error_detection_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1141626
num_examples: 998
- name: train
num_bytes: 912819
num_examples: 799
- name: validation
num_bytes: 228807
num_examples: 199
download_size: 413634
dataset_size: 2283252
- config_name: scientific_press_release_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 13690
num_examples: 50
- name: train
num_bytes: 9254
num_examples: 34
- name: validation
num_bytes: 4436
num_examples: 16
download_size: 27293
dataset_size: 27380
- config_name: semantic_parsing_in_context_sparc_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1524852
num_examples: 1155
- name: train
num_bytes: 1248391
num_examples: 924
- name: validation
num_bytes: 276461
num_examples: 231
download_size: 440326
dataset_size: 3049704
- config_name: semantic_parsing_spider_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1265744
num_examples: 1034
- name: train
num_bytes: 973864
num_examples: 828
- name: validation
num_bytes: 291880
num_examples: 206
download_size: 358276
dataset_size: 2531488
- config_name: sentence_ambiguity_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 8168
num_examples: 60
- name: train
num_bytes: 5976
num_examples: 44
- name: validation
num_bytes: 2192
num_examples: 16
download_size: 18275
dataset_size: 16336
- config_name: similarities_abstraction_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 23416
num_examples: 76
- name: train
num_bytes: 18545
num_examples: 60
- name: validation
num_bytes: 4871
num_examples: 16
download_size: 31521
dataset_size: 46832
- config_name: simp_turing_concept_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1017646
num_examples: 6390
- name: train
num_bytes: 813220
num_examples: 5112
- name: validation
num_bytes: 204426
num_examples: 1278
download_size: 402574
dataset_size: 2035292
- config_name: simple_arithmetic_json_multiple_choice_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 924
num_examples: 8
- name: train
num_bytes: 0
num_examples: 0
- name: validation
num_bytes: 0
num_examples: 0
download_size: 7777
dataset_size: 924
- config_name: simple_arithmetic_json_subtasks_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1145
num_examples: 30
- name: train
num_bytes: 571
num_examples: 15
- name: validation
num_bytes: 574
num_examples: 15
download_size: 10460
dataset_size: 2290
- config_name: simple_arithmetic_json_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1145
num_examples: 30
- name: train
num_bytes: 540
num_examples: 14
- name: validation
num_bytes: 605
num_examples: 16
download_size: 10645
dataset_size: 2290
- config_name: simple_arithmetic_multiple_targets_json_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 414
num_examples: 10
- name: train
num_bytes: 0
num_examples: 0
- name: validation
num_bytes: 0
num_examples: 0
download_size: 7352
dataset_size: 414
- config_name: simple_ethical_questions_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 76518
num_examples: 115
- name: train
num_bytes: 60275
num_examples: 92
- name: validation
num_bytes: 16243
num_examples: 23
download_size: 81285
dataset_size: 153036
- config_name: simple_text_editing_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 27865
num_examples: 47
- name: train
num_bytes: 18469
num_examples: 31
- name: validation
num_bytes: 9396
num_examples: 16
download_size: 27100
dataset_size: 55730
- config_name: snarks_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 45717
num_examples: 181
- name: train
num_bytes: 36989
num_examples: 145
- name: validation
num_bytes: 8728
num_examples: 36
download_size: 45434
dataset_size: 91434
- config_name: social_iqa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 643162
num_examples: 1935
- name: train
num_bytes: 515686
num_examples: 1548
- name: validation
num_bytes: 127476
num_examples: 387
download_size: 684043
dataset_size: 1286324
- config_name: social_support_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 366705
num_examples: 897
- name: train
num_bytes: 294793
num_examples: 718
- name: validation
num_bytes: 71912
num_examples: 179
download_size: 288867
dataset_size: 733410
- config_name: sports_understanding_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 226654
num_examples: 986
- name: train
num_bytes: 181328
num_examples: 789
- name: validation
num_bytes: 45326
num_examples: 197
download_size: 82415
dataset_size: 453308
- config_name: strange_stories_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 120500
num_examples: 174
- name: train
num_bytes: 98055
num_examples: 140
- name: validation
num_bytes: 22445
num_examples: 34
download_size: 106428
dataset_size: 241000
- config_name: strategyqa_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 659967
num_examples: 2289
- name: train
num_bytes: 527670
num_examples: 1832
- name: validation
num_bytes: 132297
num_examples: 457
download_size: 814405
dataset_size: 1319934
- config_name: sufficient_information_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 9425
num_examples: 39
- name: train
num_bytes: 5594
num_examples: 23
- name: validation
num_bytes: 3831
num_examples: 16
download_size: 17766
dataset_size: 18850
- config_name: suicide_risk_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 37952
num_examples: 40
- name: train
num_bytes: 23067
num_examples: 24
- name: validation
num_bytes: 14885
num_examples: 16
download_size: 60518
dataset_size: 75904
- config_name: swahili_english_proverbs_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 90246
num_examples: 153
- name: train
num_bytes: 72467
num_examples: 123
- name: validation
num_bytes: 17779
num_examples: 30
download_size: 95186
dataset_size: 180492
- config_name: swedish_to_german_proverbs_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 35204
num_examples: 72
- name: train
num_bytes: 27266
num_examples: 56
- name: validation
num_bytes: 7938
num_examples: 16
download_size: 55102
dataset_size: 70408
- config_name: symbol_interpretation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1148958
num_examples: 990
- name: train
num_bytes: 927326
num_examples: 795
- name: validation
num_bytes: 221632
num_examples: 195
download_size: 320412
dataset_size: 2297916
- config_name: temporal_sequences_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 687086
num_examples: 1000
- name: train
num_bytes: 549808
num_examples: 800
- name: validation
num_bytes: 137278
num_examples: 200
download_size: 295316
dataset_size: 1374172
- config_name: tense_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 43882
num_examples: 286
- name: train
num_bytes: 35466
num_examples: 229
- name: validation
num_bytes: 8416
num_examples: 57
download_size: 51466
dataset_size: 87764
- config_name: timedial_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 2763178
num_examples: 2550
- name: train
num_bytes: 2217190
num_examples: 2040
- name: validation
num_bytes: 545988
num_examples: 510
download_size: 2444115
dataset_size: 5526356
- config_name: topical_chat_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 30927758
num_examples: 22295
- name: train
num_bytes: 24827254
num_examples: 17836
- name: validation
num_bytes: 6100504
num_examples: 4459
download_size: 23505731
dataset_size: 61855516
- config_name: tracking_shuffled_objects_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 2775972
num_examples: 3750
- name: train
num_bytes: 2224037
num_examples: 3000
- name: validation
num_bytes: 551935
num_examples: 750
download_size: 738413
dataset_size: 5551944
- config_name: understanding_fables_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 227748
num_examples: 189
- name: train
num_bytes: 181000
num_examples: 152
- name: validation
num_bytes: 46748
num_examples: 37
download_size: 237036
dataset_size: 455496
- config_name: undo_permutation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 196118
num_examples: 300
- name: train
num_bytes: 158562
num_examples: 240
- name: validation
num_bytes: 37556
num_examples: 60
download_size: 137204
dataset_size: 392236
- config_name: unit_conversion_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 4028628
num_examples: 23936
- name: train
num_bytes: 3230357
num_examples: 19151
- name: validation
num_bytes: 798271
num_examples: 4785
download_size: 3208622
dataset_size: 8057256
- config_name: unit_interpretation_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 37363
num_examples: 100
- name: train
num_bytes: 29939
num_examples: 80
- name: validation
num_bytes: 7424
num_examples: 20
download_size: 34926
dataset_size: 74726
- config_name: unnatural_in_context_learning_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 4599760
num_examples: 73420
- name: train
num_bytes: 3679822
num_examples: 58736
- name: validation
num_bytes: 919938
num_examples: 14684
download_size: 3840657
dataset_size: 9199520
- config_name: vitaminc_fact_verification_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 32361818
num_examples: 54668
- name: train
num_bytes: 25889850
num_examples: 43735
- name: validation
num_bytes: 6471968
num_examples: 10933
download_size: 14264790
dataset_size: 64723636
- config_name: what_is_the_tao_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 13268
num_examples: 36
- name: train
num_bytes: 7435
num_examples: 20
- name: validation
num_bytes: 5833
num_examples: 16
download_size: 27585
dataset_size: 26536
- config_name: which_wiki_edit_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 6331683
num_examples: 571
- name: train
num_bytes: 5233870
num_examples: 457
- name: validation
num_bytes: 1097813
num_examples: 114
download_size: 3914574
dataset_size: 12663366
- config_name: winowhy_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 1002434
num_examples: 2862
- name: train
num_bytes: 800520
num_examples: 2290
- name: validation
num_bytes: 201914
num_examples: 572
download_size: 449218
dataset_size: 2004868
- config_name: word_sorting_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 491054
num_examples: 1900
- name: train
num_bytes: 392738
num_examples: 1520
- name: validation
num_bytes: 98316
num_examples: 380
download_size: 641536
dataset_size: 982108
- config_name: word_unscrambling_zero_shot
features:
- name: idx
dtype: int32
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
splits:
- name: default
num_bytes: 882364
num_examples: 8917
- name: train
num_bytes: 705755
num_examples: 7134
- name: validation
num_bytes: 176609
num_examples: 1783
download_size: 563799
dataset_size: 1764728
configs:
- config_name: abstract_narrative_understanding_zero_shot
data_files:
- split: default
path: abstract_narrative_understanding_zero_shot/default-*
- split: train
path: abstract_narrative_understanding_zero_shot/train-*
- split: validation
path: abstract_narrative_understanding_zero_shot/validation-*
- config_name: anachronisms_zero_shot
data_files:
- split: default
path: anachronisms_zero_shot/default-*
- split: train
path: anachronisms_zero_shot/train-*
- split: validation
path: anachronisms_zero_shot/validation-*
- config_name: analogical_similarity_zero_shot
data_files:
- split: default
path: analogical_similarity_zero_shot/default-*
- split: train
path: analogical_similarity_zero_shot/train-*
- split: validation
path: analogical_similarity_zero_shot/validation-*
- config_name: analytic_entailment_zero_shot
data_files:
- split: default
path: analytic_entailment_zero_shot/default-*
- split: train
path: analytic_entailment_zero_shot/train-*
- split: validation
path: analytic_entailment_zero_shot/validation-*
- config_name: arithmetic_zero_shot
data_files:
- split: default
path: arithmetic_zero_shot/default-*
- split: train
path: arithmetic_zero_shot/train-*
- split: validation
path: arithmetic_zero_shot/validation-*
- config_name: ascii_word_recognition_zero_shot
data_files:
- split: default
path: ascii_word_recognition_zero_shot/default-*
- split: train
path: ascii_word_recognition_zero_shot/train-*
- split: validation
path: ascii_word_recognition_zero_shot/validation-*
- config_name: authorship_verification_zero_shot
data_files:
- split: default
path: authorship_verification_zero_shot/default-*
- split: train
path: authorship_verification_zero_shot/train-*
- split: validation
path: authorship_verification_zero_shot/validation-*
- config_name: auto_categorization_zero_shot
data_files:
- split: default
path: auto_categorization_zero_shot/default-*
- split: train
path: auto_categorization_zero_shot/train-*
- split: validation
path: auto_categorization_zero_shot/validation-*
- config_name: auto_debugging_zero_shot
data_files:
- split: default
path: auto_debugging_zero_shot/default-*
- split: train
path: auto_debugging_zero_shot/train-*
- split: validation
path: auto_debugging_zero_shot/validation-*
- config_name: bbq_lite_json_zero_shot
data_files:
- split: default
path: bbq_lite_json_zero_shot/default-*
- split: train
path: bbq_lite_json_zero_shot/train-*
- split: validation
path: bbq_lite_json_zero_shot/validation-*
- config_name: bridging_anaphora_resolution_barqa_zero_shot
data_files:
- split: default
path: bridging_anaphora_resolution_barqa_zero_shot/default-*
- split: train
path: bridging_anaphora_resolution_barqa_zero_shot/train-*
- split: validation
path: bridging_anaphora_resolution_barqa_zero_shot/validation-*
- config_name: causal_judgment_zero_shot
data_files:
- split: default
path: causal_judgment_zero_shot/default-*
- split: train
path: causal_judgment_zero_shot/train-*
- split: validation
path: causal_judgment_zero_shot/validation-*
- config_name: cause_and_effect_zero_shot
data_files:
- split: default
path: cause_and_effect_zero_shot/default-*
- split: train
path: cause_and_effect_zero_shot/train-*
- split: validation
path: cause_and_effect_zero_shot/validation-*
- config_name: checkmate_in_one_zero_shot
data_files:
- split: default
path: checkmate_in_one_zero_shot/default-*
- split: train
path: checkmate_in_one_zero_shot/train-*
- split: validation
path: checkmate_in_one_zero_shot/validation-*
- config_name: chess_state_tracking_zero_shot
data_files:
- split: default
path: chess_state_tracking_zero_shot/default-*
- split: train
path: chess_state_tracking_zero_shot/train-*
- split: validation
path: chess_state_tracking_zero_shot/validation-*
- config_name: chinese_remainder_theorem_zero_shot
data_files:
- split: default
path: chinese_remainder_theorem_zero_shot/default-*
- split: train
path: chinese_remainder_theorem_zero_shot/train-*
- split: validation
path: chinese_remainder_theorem_zero_shot/validation-*
- config_name: cifar10_classification_zero_shot
data_files:
- split: default
path: cifar10_classification_zero_shot/default-*
- split: train
path: cifar10_classification_zero_shot/train-*
- split: validation
path: cifar10_classification_zero_shot/validation-*
- config_name: code_line_description_zero_shot
data_files:
- split: default
path: code_line_description_zero_shot/default-*
- split: train
path: code_line_description_zero_shot/train-*
- split: validation
path: code_line_description_zero_shot/validation-*
- config_name: codenames_zero_shot
data_files:
- split: default
path: codenames_zero_shot/default-*
- split: train
path: codenames_zero_shot/train-*
- split: validation
path: codenames_zero_shot/validation-*
- config_name: color_zero_shot
data_files:
- split: default
path: color_zero_shot/default-*
- split: train
path: color_zero_shot/train-*
- split: validation
path: color_zero_shot/validation-*
- config_name: common_morpheme_zero_shot
data_files:
- split: default
path: common_morpheme_zero_shot/default-*
- split: train
path: common_morpheme_zero_shot/train-*
- split: validation
path: common_morpheme_zero_shot/validation-*
- config_name: conceptual_combinations_zero_shot
data_files:
- split: default
path: conceptual_combinations_zero_shot/default-*
- split: train
path: conceptual_combinations_zero_shot/train-*
- split: validation
path: conceptual_combinations_zero_shot/validation-*
- config_name: conlang_translation_zero_shot
data_files:
- split: default
path: conlang_translation_zero_shot/default-*
- split: train
path: conlang_translation_zero_shot/train-*
- split: validation
path: conlang_translation_zero_shot/validation-*
- config_name: contextual_parametric_knowledge_conflicts_zero_shot
data_files:
- split: default
path: contextual_parametric_knowledge_conflicts_zero_shot/default-*
- split: train
path: contextual_parametric_knowledge_conflicts_zero_shot/train-*
- split: validation
path: contextual_parametric_knowledge_conflicts_zero_shot/validation-*
- config_name: crash_blossom_zero_shot
data_files:
- split: default
path: crash_blossom_zero_shot/default-*
- split: train
path: crash_blossom_zero_shot/train-*
- split: validation
path: crash_blossom_zero_shot/validation-*
- config_name: crass_ai_zero_shot
data_files:
- split: default
path: crass_ai_zero_shot/default-*
- split: train
path: crass_ai_zero_shot/train-*
- split: validation
path: crass_ai_zero_shot/validation-*
- config_name: cryobiology_spanish_zero_shot
data_files:
- split: default
path: cryobiology_spanish_zero_shot/default-*
- split: train
path: cryobiology_spanish_zero_shot/train-*
- split: validation
path: cryobiology_spanish_zero_shot/validation-*
- config_name: cryptonite_zero_shot
data_files:
- split: default
path: cryptonite_zero_shot/default-*
- split: train
path: cryptonite_zero_shot/train-*
- split: validation
path: cryptonite_zero_shot/validation-*
- config_name: cs_algorithms_zero_shot
data_files:
- split: default
path: cs_algorithms_zero_shot/default-*
- split: train
path: cs_algorithms_zero_shot/train-*
- split: validation
path: cs_algorithms_zero_shot/validation-*
- config_name: dark_humor_detection_zero_shot
data_files:
- split: default
path: dark_humor_detection_zero_shot/default-*
- split: train
path: dark_humor_detection_zero_shot/train-*
- split: validation
path: dark_humor_detection_zero_shot/validation-*
- config_name: date_understanding_zero_shot
data_files:
- split: default
path: date_understanding_zero_shot/default-*
- split: train
path: date_understanding_zero_shot/train-*
- split: validation
path: date_understanding_zero_shot/validation-*
- config_name: disambiguation_qa_zero_shot
data_files:
- split: default
path: disambiguation_qa_zero_shot/default-*
- split: train
path: disambiguation_qa_zero_shot/train-*
- split: validation
path: disambiguation_qa_zero_shot/validation-*
- config_name: discourse_marker_prediction_zero_shot
data_files:
- split: default
path: discourse_marker_prediction_zero_shot/default-*
- split: train
path: discourse_marker_prediction_zero_shot/train-*
- split: validation
path: discourse_marker_prediction_zero_shot/validation-*
- config_name: disfl_qa_zero_shot
data_files:
- split: default
path: disfl_qa_zero_shot/default-*
- split: train
path: disfl_qa_zero_shot/train-*
- split: validation
path: disfl_qa_zero_shot/validation-*
- config_name: dyck_languages_zero_shot
data_files:
- split: default
path: dyck_languages_zero_shot/default-*
- split: train
path: dyck_languages_zero_shot/train-*
- split: validation
path: dyck_languages_zero_shot/validation-*
- config_name: elementary_math_qa_zero_shot
data_files:
- split: default
path: elementary_math_qa_zero_shot/default-*
- split: train
path: elementary_math_qa_zero_shot/train-*
- split: validation
path: elementary_math_qa_zero_shot/validation-*
- config_name: emoji_movie_zero_shot
data_files:
- split: default
path: emoji_movie_zero_shot/default-*
- split: train
path: emoji_movie_zero_shot/train-*
- split: validation
path: emoji_movie_zero_shot/validation-*
- config_name: emojis_emotion_prediction_zero_shot
data_files:
- split: default
path: emojis_emotion_prediction_zero_shot/default-*
- split: train
path: emojis_emotion_prediction_zero_shot/train-*
- split: validation
path: emojis_emotion_prediction_zero_shot/validation-*
- config_name: empirical_judgments_zero_shot
data_files:
- split: default
path: empirical_judgments_zero_shot/default-*
- split: train
path: empirical_judgments_zero_shot/train-*
- split: validation
path: empirical_judgments_zero_shot/validation-*
- config_name: english_proverbs_zero_shot
data_files:
- split: default
path: english_proverbs_zero_shot/default-*
- split: train
path: english_proverbs_zero_shot/train-*
- split: validation
path: english_proverbs_zero_shot/validation-*
- config_name: english_russian_proverbs_zero_shot
data_files:
- split: default
path: english_russian_proverbs_zero_shot/default-*
- split: train
path: english_russian_proverbs_zero_shot/train-*
- split: validation
path: english_russian_proverbs_zero_shot/validation-*
- config_name: entailed_polarity_hindi_zero_shot
data_files:
- split: default
path: entailed_polarity_hindi_zero_shot/default-*
- split: train
path: entailed_polarity_hindi_zero_shot/train-*
- split: validation
path: entailed_polarity_hindi_zero_shot/validation-*
- config_name: entailed_polarity_zero_shot
data_files:
- split: default
path: entailed_polarity_zero_shot/default-*
- split: train
path: entailed_polarity_zero_shot/train-*
- split: validation
path: entailed_polarity_zero_shot/validation-*
- config_name: epistemic_reasoning_zero_shot
data_files:
- split: default
path: epistemic_reasoning_zero_shot/default-*
- split: train
path: epistemic_reasoning_zero_shot/train-*
- split: validation
path: epistemic_reasoning_zero_shot/validation-*
- config_name: evaluating_information_essentiality_zero_shot
data_files:
- split: default
path: evaluating_information_essentiality_zero_shot/default-*
- split: train
path: evaluating_information_essentiality_zero_shot/train-*
- split: validation
path: evaluating_information_essentiality_zero_shot/validation-*
- config_name: fact_checker_zero_shot
data_files:
- split: default
path: fact_checker_zero_shot/default-*
- split: train
path: fact_checker_zero_shot/train-*
- split: validation
path: fact_checker_zero_shot/validation-*
- config_name: fantasy_reasoning_zero_shot
data_files:
- split: default
path: fantasy_reasoning_zero_shot/default-*
- split: train
path: fantasy_reasoning_zero_shot/train-*
- split: validation
path: fantasy_reasoning_zero_shot/validation-*
- config_name: few_shot_nlg_zero_shot
data_files:
- split: default
path: few_shot_nlg_zero_shot/default-*
- split: train
path: few_shot_nlg_zero_shot/train-*
- split: validation
path: few_shot_nlg_zero_shot/validation-*
- config_name: figure_of_speech_detection_zero_shot
data_files:
- split: default
path: figure_of_speech_detection_zero_shot/default-*
- split: train
path: figure_of_speech_detection_zero_shot/train-*
- split: validation
path: figure_of_speech_detection_zero_shot/validation-*
- config_name: formal_fallacies_syllogisms_negation_zero_shot
data_files:
- split: default
path: formal_fallacies_syllogisms_negation_zero_shot/default-*
- split: train
path: formal_fallacies_syllogisms_negation_zero_shot/train-*
- split: validation
path: formal_fallacies_syllogisms_negation_zero_shot/validation-*
- config_name: gem_zero_shot
data_files:
- split: default
path: gem_zero_shot/default-*
- split: train
path: gem_zero_shot/train-*
- split: validation
path: gem_zero_shot/validation-*
- config_name: gender_inclusive_sentences_german_zero_shot
data_files:
- split: default
path: gender_inclusive_sentences_german_zero_shot/default-*
- split: train
path: gender_inclusive_sentences_german_zero_shot/train-*
- split: validation
path: gender_inclusive_sentences_german_zero_shot/validation-*
- config_name: general_knowledge_zero_shot
data_files:
- split: default
path: general_knowledge_zero_shot/default-*
- split: train
path: general_knowledge_zero_shot/train-*
- split: validation
path: general_knowledge_zero_shot/validation-*
- config_name: geometric_shapes_zero_shot
data_files:
- split: default
path: geometric_shapes_zero_shot/default-*
- split: train
path: geometric_shapes_zero_shot/train-*
- split: validation
path: geometric_shapes_zero_shot/validation-*
- config_name: goal_step_wikihow_zero_shot
data_files:
- split: default
path: goal_step_wikihow_zero_shot/default-*
- split: train
path: goal_step_wikihow_zero_shot/train-*
- split: validation
path: goal_step_wikihow_zero_shot/validation-*
- config_name: gre_reading_comprehension_zero_shot
data_files:
- split: default
path: gre_reading_comprehension_zero_shot/default-*
- split: train
path: gre_reading_comprehension_zero_shot/train-*
- split: validation
path: gre_reading_comprehension_zero_shot/validation-*
- config_name: hhh_alignment_zero_shot
data_files:
- split: default
path: hhh_alignment_zero_shot/default-*
- split: train
path: hhh_alignment_zero_shot/train-*
- split: validation
path: hhh_alignment_zero_shot/validation-*
- config_name: hindi_question_answering_zero_shot
data_files:
- split: default
path: hindi_question_answering_zero_shot/default-*
- split: train
path: hindi_question_answering_zero_shot/train-*
- split: validation
path: hindi_question_answering_zero_shot/validation-*
- config_name: hindu_knowledge_zero_shot
data_files:
- split: default
path: hindu_knowledge_zero_shot/default-*
- split: train
path: hindu_knowledge_zero_shot/train-*
- split: validation
path: hindu_knowledge_zero_shot/validation-*
- config_name: hinglish_toxicity_zero_shot
data_files:
- split: default
path: hinglish_toxicity_zero_shot/default-*
- split: train
path: hinglish_toxicity_zero_shot/train-*
- split: validation
path: hinglish_toxicity_zero_shot/validation-*
- config_name: human_organs_senses_zero_shot
data_files:
- split: default
path: human_organs_senses_zero_shot/default-*
- split: train
path: human_organs_senses_zero_shot/train-*
- split: validation
path: human_organs_senses_zero_shot/validation-*
- config_name: hyperbaton_zero_shot
data_files:
- split: default
path: hyperbaton_zero_shot/default-*
- split: train
path: hyperbaton_zero_shot/train-*
- split: validation
path: hyperbaton_zero_shot/validation-*
- config_name: identify_math_theorems_zero_shot
data_files:
- split: default
path: identify_math_theorems_zero_shot/default-*
- split: train
path: identify_math_theorems_zero_shot/train-*
- split: validation
path: identify_math_theorems_zero_shot/validation-*
- config_name: identify_odd_metaphor_zero_shot
data_files:
- split: default
path: identify_odd_metaphor_zero_shot/default-*
- split: train
path: identify_odd_metaphor_zero_shot/train-*
- split: validation
path: identify_odd_metaphor_zero_shot/validation-*
- config_name: implicatures_zero_shot
data_files:
- split: default
path: implicatures_zero_shot/default-*
- split: train
path: implicatures_zero_shot/train-*
- split: validation
path: implicatures_zero_shot/validation-*
- config_name: implicit_relations_zero_shot
data_files:
- split: default
path: implicit_relations_zero_shot/default-*
- split: train
path: implicit_relations_zero_shot/train-*
- split: validation
path: implicit_relations_zero_shot/validation-*
- config_name: intent_recognition_zero_shot
data_files:
- split: default
path: intent_recognition_zero_shot/default-*
- split: train
path: intent_recognition_zero_shot/train-*
- split: validation
path: intent_recognition_zero_shot/validation-*
- config_name: international_phonetic_alphabet_nli_zero_shot
data_files:
- split: default
path: international_phonetic_alphabet_nli_zero_shot/default-*
- split: train
path: international_phonetic_alphabet_nli_zero_shot/train-*
- split: validation
path: international_phonetic_alphabet_nli_zero_shot/validation-*
- config_name: international_phonetic_alphabet_transliterate_zero_shot
data_files:
- split: default
path: international_phonetic_alphabet_transliterate_zero_shot/default-*
- split: train
path: international_phonetic_alphabet_transliterate_zero_shot/train-*
- split: validation
path: international_phonetic_alphabet_transliterate_zero_shot/validation-*
- config_name: intersect_geometry_zero_shot
data_files:
- split: default
path: intersect_geometry_zero_shot/default-*
- split: train
path: intersect_geometry_zero_shot/train-*
- split: validation
path: intersect_geometry_zero_shot/validation-*
- config_name: irony_identification_zero_shot
data_files:
- split: default
path: irony_identification_zero_shot/default-*
- split: train
path: irony_identification_zero_shot/train-*
- split: validation
path: irony_identification_zero_shot/validation-*
- config_name: kanji_ascii_zero_shot
data_files:
- split: default
path: kanji_ascii_zero_shot/default-*
- split: train
path: kanji_ascii_zero_shot/train-*
- split: validation
path: kanji_ascii_zero_shot/validation-*
- config_name: kannada_zero_shot
data_files:
- split: default
path: kannada_zero_shot/default-*
- split: train
path: kannada_zero_shot/train-*
- split: validation
path: kannada_zero_shot/validation-*
- config_name: key_value_maps_zero_shot
data_files:
- split: default
path: key_value_maps_zero_shot/default-*
- split: train
path: key_value_maps_zero_shot/train-*
- split: validation
path: key_value_maps_zero_shot/validation-*
- config_name: known_unknowns_zero_shot
data_files:
- split: default
path: known_unknowns_zero_shot/default-*
- split: train
path: known_unknowns_zero_shot/train-*
- split: validation
path: known_unknowns_zero_shot/validation-*
- config_name: language_games_zero_shot
data_files:
- split: default
path: language_games_zero_shot/default-*
- split: train
path: language_games_zero_shot/train-*
- split: validation
path: language_games_zero_shot/validation-*
- config_name: language_identification_zero_shot
data_files:
- split: default
path: language_identification_zero_shot/default-*
- split: train
path: language_identification_zero_shot/train-*
- split: validation
path: language_identification_zero_shot/validation-*
- config_name: linguistic_mappings_zero_shot
data_files:
- split: default
path: linguistic_mappings_zero_shot/default-*
- split: train
path: linguistic_mappings_zero_shot/train-*
- split: validation
path: linguistic_mappings_zero_shot/validation-*
- config_name: linguistics_puzzles_zero_shot
data_files:
- split: default
path: linguistics_puzzles_zero_shot/default-*
- split: train
path: linguistics_puzzles_zero_shot/train-*
- split: validation
path: linguistics_puzzles_zero_shot/validation-*
- config_name: list_functions_zero_shot
data_files:
- split: default
path: list_functions_zero_shot/default-*
- split: train
path: list_functions_zero_shot/train-*
- split: validation
path: list_functions_zero_shot/validation-*
- config_name: logic_grid_puzzle_zero_shot
data_files:
- split: default
path: logic_grid_puzzle_zero_shot/default-*
- split: train
path: logic_grid_puzzle_zero_shot/train-*
- split: validation
path: logic_grid_puzzle_zero_shot/validation-*
- config_name: logical_args_zero_shot
data_files:
- split: default
path: logical_args_zero_shot/default-*
- split: train
path: logical_args_zero_shot/train-*
- split: validation
path: logical_args_zero_shot/validation-*
- config_name: logical_deduction_zero_shot
data_files:
- split: default
path: logical_deduction_zero_shot/default-*
- split: train
path: logical_deduction_zero_shot/train-*
- split: validation
path: logical_deduction_zero_shot/validation-*
- config_name: logical_fallacy_detection_zero_shot
data_files:
- split: default
path: logical_fallacy_detection_zero_shot/default-*
- split: train
path: logical_fallacy_detection_zero_shot/train-*
- split: validation
path: logical_fallacy_detection_zero_shot/validation-*
- config_name: logical_sequence_zero_shot
data_files:
- split: default
path: logical_sequence_zero_shot/default-*
- split: train
path: logical_sequence_zero_shot/train-*
- split: validation
path: logical_sequence_zero_shot/validation-*
- config_name: mathematical_induction_zero_shot
data_files:
- split: default
path: mathematical_induction_zero_shot/default-*
- split: train
path: mathematical_induction_zero_shot/train-*
- split: validation
path: mathematical_induction_zero_shot/validation-*
- config_name: matrixshapes_zero_shot
data_files:
- split: default
path: matrixshapes_zero_shot/default-*
- split: train
path: matrixshapes_zero_shot/train-*
- split: validation
path: matrixshapes_zero_shot/validation-*
- config_name: metaphor_boolean_zero_shot
data_files:
- split: default
path: metaphor_boolean_zero_shot/default-*
- split: train
path: metaphor_boolean_zero_shot/train-*
- split: validation
path: metaphor_boolean_zero_shot/validation-*
- config_name: metaphor_understanding_zero_shot
data_files:
- split: default
path: metaphor_understanding_zero_shot/default-*
- split: train
path: metaphor_understanding_zero_shot/train-*
- split: validation
path: metaphor_understanding_zero_shot/validation-*
- config_name: minute_mysteries_qa_zero_shot
data_files:
- split: default
path: minute_mysteries_qa_zero_shot/default-*
- split: train
path: minute_mysteries_qa_zero_shot/train-*
- split: validation
path: minute_mysteries_qa_zero_shot/validation-*
- config_name: misconceptions_russian_zero_shot
data_files:
- split: default
path: misconceptions_russian_zero_shot/default-*
- split: train
path: misconceptions_russian_zero_shot/train-*
- split: validation
path: misconceptions_russian_zero_shot/validation-*
- config_name: misconceptions_zero_shot
data_files:
- split: default
path: misconceptions_zero_shot/default-*
- split: train
path: misconceptions_zero_shot/train-*
- split: validation
path: misconceptions_zero_shot/validation-*
- config_name: mnist_ascii_zero_shot
data_files:
- split: default
path: mnist_ascii_zero_shot/default-*
- split: train
path: mnist_ascii_zero_shot/train-*
- split: validation
path: mnist_ascii_zero_shot/validation-*
- config_name: modified_arithmetic_zero_shot
data_files:
- split: default
path: modified_arithmetic_zero_shot/default-*
- split: train
path: modified_arithmetic_zero_shot/train-*
- split: validation
path: modified_arithmetic_zero_shot/validation-*
- config_name: moral_permissibility_zero_shot
data_files:
- split: default
path: moral_permissibility_zero_shot/default-*
- split: train
path: moral_permissibility_zero_shot/train-*
- split: validation
path: moral_permissibility_zero_shot/validation-*
- config_name: movie_dialog_same_or_different_zero_shot
data_files:
- split: default
path: movie_dialog_same_or_different_zero_shot/default-*
- split: train
path: movie_dialog_same_or_different_zero_shot/train-*
- split: validation
path: movie_dialog_same_or_different_zero_shot/validation-*
- config_name: movie_recommendation_zero_shot
data_files:
- split: default
path: movie_recommendation_zero_shot/default-*
- split: train
path: movie_recommendation_zero_shot/train-*
- split: validation
path: movie_recommendation_zero_shot/validation-*
- config_name: mult_data_wrangling_zero_shot
data_files:
- split: default
path: mult_data_wrangling_zero_shot/default-*
- split: train
path: mult_data_wrangling_zero_shot/train-*
- split: validation
path: mult_data_wrangling_zero_shot/validation-*
- config_name: multiemo_zero_shot
data_files:
- split: default
path: multiemo_zero_shot/default-*
- split: train
path: multiemo_zero_shot/train-*
- split: validation
path: multiemo_zero_shot/validation-*
- config_name: natural_instructions_zero_shot
data_files:
- split: default
path: natural_instructions_zero_shot/default-*
- split: train
path: natural_instructions_zero_shot/train-*
- split: validation
path: natural_instructions_zero_shot/validation-*
- config_name: navigate_zero_shot
data_files:
- split: default
path: navigate_zero_shot/default-*
- split: train
path: navigate_zero_shot/train-*
- split: validation
path: navigate_zero_shot/validation-*
- config_name: nonsense_words_grammar_zero_shot
data_files:
- split: default
path: nonsense_words_grammar_zero_shot/default-*
- split: train
path: nonsense_words_grammar_zero_shot/train-*
- split: validation
path: nonsense_words_grammar_zero_shot/validation-*
- config_name: novel_concepts_zero_shot
data_files:
- split: default
path: novel_concepts_zero_shot/default-*
- split: train
path: novel_concepts_zero_shot/train-*
- split: validation
path: novel_concepts_zero_shot/validation-*
- config_name: object_counting_zero_shot
data_files:
- split: default
path: object_counting_zero_shot/default-*
- split: train
path: object_counting_zero_shot/train-*
- split: validation
path: object_counting_zero_shot/validation-*
- config_name: odd_one_out_zero_shot
data_files:
- split: default
path: odd_one_out_zero_shot/default-*
- split: train
path: odd_one_out_zero_shot/train-*
- split: validation
path: odd_one_out_zero_shot/validation-*
- config_name: operators_zero_shot
data_files:
- split: default
path: operators_zero_shot/default-*
- split: train
path: operators_zero_shot/train-*
- split: validation
path: operators_zero_shot/validation-*
- config_name: paragraph_segmentation_zero_shot
data_files:
- split: default
path: paragraph_segmentation_zero_shot/default-*
- split: train
path: paragraph_segmentation_zero_shot/train-*
- split: validation
path: paragraph_segmentation_zero_shot/validation-*
- config_name: parsinlu_qa_zero_shot
data_files:
- split: default
path: parsinlu_qa_zero_shot/default-*
- split: train
path: parsinlu_qa_zero_shot/train-*
- split: validation
path: parsinlu_qa_zero_shot/validation-*
- config_name: parsinlu_reading_comprehension_zero_shot
data_files:
- split: default
path: parsinlu_reading_comprehension_zero_shot/default-*
- split: train
path: parsinlu_reading_comprehension_zero_shot/train-*
- split: validation
path: parsinlu_reading_comprehension_zero_shot/validation-*
- config_name: penguins_in_a_table_zero_shot
data_files:
- split: default
path: penguins_in_a_table_zero_shot/default-*
- split: train
path: penguins_in_a_table_zero_shot/train-*
- split: validation
path: penguins_in_a_table_zero_shot/validation-*
- config_name: periodic_elements_zero_shot
data_files:
- split: default
path: periodic_elements_zero_shot/default-*
- split: train
path: periodic_elements_zero_shot/train-*
- split: validation
path: periodic_elements_zero_shot/validation-*
- config_name: persian_idioms_zero_shot
data_files:
- split: default
path: persian_idioms_zero_shot/default-*
- split: train
path: persian_idioms_zero_shot/train-*
- split: validation
path: persian_idioms_zero_shot/validation-*
- config_name: phrase_relatedness_zero_shot
data_files:
- split: default
path: phrase_relatedness_zero_shot/default-*
- split: train
path: phrase_relatedness_zero_shot/train-*
- split: validation
path: phrase_relatedness_zero_shot/validation-*
- config_name: physical_intuition_zero_shot
data_files:
- split: default
path: physical_intuition_zero_shot/default-*
- split: train
path: physical_intuition_zero_shot/train-*
- split: validation
path: physical_intuition_zero_shot/validation-*
- config_name: physics_questions_zero_shot
data_files:
- split: default
path: physics_questions_zero_shot/default-*
- split: train
path: physics_questions_zero_shot/train-*
- split: validation
path: physics_questions_zero_shot/validation-*
- config_name: physics_zero_shot
data_files:
- split: default
path: physics_zero_shot/default-*
- split: train
path: physics_zero_shot/train-*
- split: validation
path: physics_zero_shot/validation-*
- config_name: play_dialog_same_or_different_zero_shot
data_files:
- split: default
path: play_dialog_same_or_different_zero_shot/default-*
- split: train
path: play_dialog_same_or_different_zero_shot/train-*
- split: validation
path: play_dialog_same_or_different_zero_shot/validation-*
- config_name: polish_sequence_labeling_zero_shot
data_files:
- split: default
path: polish_sequence_labeling_zero_shot/default-*
- split: train
path: polish_sequence_labeling_zero_shot/train-*
- split: validation
path: polish_sequence_labeling_zero_shot/validation-*
- config_name: presuppositions_as_nli_zero_shot
data_files:
- split: default
path: presuppositions_as_nli_zero_shot/default-*
- split: train
path: presuppositions_as_nli_zero_shot/train-*
- split: validation
path: presuppositions_as_nli_zero_shot/validation-*
- config_name: qa_wikidata_zero_shot
data_files:
- split: default
path: qa_wikidata_zero_shot/default-*
- split: train
path: qa_wikidata_zero_shot/train-*
- split: validation
path: qa_wikidata_zero_shot/validation-*
- config_name: question_selection_zero_shot
data_files:
- split: default
path: question_selection_zero_shot/default-*
- split: train
path: question_selection_zero_shot/train-*
- split: validation
path: question_selection_zero_shot/validation-*
- config_name: real_or_fake_text_zero_shot
data_files:
- split: default
path: real_or_fake_text_zero_shot/default-*
- split: train
path: real_or_fake_text_zero_shot/train-*
- split: validation
path: real_or_fake_text_zero_shot/validation-*
- config_name: reasoning_about_colored_objects_zero_shot
data_files:
- split: default
path: reasoning_about_colored_objects_zero_shot/default-*
- split: train
path: reasoning_about_colored_objects_zero_shot/train-*
- split: validation
path: reasoning_about_colored_objects_zero_shot/validation-*
- config_name: repeat_copy_logic_zero_shot
data_files:
- split: default
path: repeat_copy_logic_zero_shot/default-*
- split: train
path: repeat_copy_logic_zero_shot/train-*
- split: validation
path: repeat_copy_logic_zero_shot/validation-*
- config_name: rephrase_zero_shot
data_files:
- split: default
path: rephrase_zero_shot/default-*
- split: train
path: rephrase_zero_shot/train-*
- split: validation
path: rephrase_zero_shot/validation-*
- config_name: riddle_sense_zero_shot
data_files:
- split: default
path: riddle_sense_zero_shot/default-*
- split: train
path: riddle_sense_zero_shot/train-*
- split: validation
path: riddle_sense_zero_shot/validation-*
- config_name: ruin_names_zero_shot
data_files:
- split: default
path: ruin_names_zero_shot/default-*
- split: train
path: ruin_names_zero_shot/train-*
- split: validation
path: ruin_names_zero_shot/validation-*
- config_name: salient_translation_error_detection_zero_shot
data_files:
- split: default
path: salient_translation_error_detection_zero_shot/default-*
- split: train
path: salient_translation_error_detection_zero_shot/train-*
- split: validation
path: salient_translation_error_detection_zero_shot/validation-*
- config_name: scientific_press_release_zero_shot
data_files:
- split: default
path: scientific_press_release_zero_shot/default-*
- split: train
path: scientific_press_release_zero_shot/train-*
- split: validation
path: scientific_press_release_zero_shot/validation-*
- config_name: semantic_parsing_in_context_sparc_zero_shot
data_files:
- split: default
path: semantic_parsing_in_context_sparc_zero_shot/default-*
- split: train
path: semantic_parsing_in_context_sparc_zero_shot/train-*
- split: validation
path: semantic_parsing_in_context_sparc_zero_shot/validation-*
- config_name: semantic_parsing_spider_zero_shot
data_files:
- split: default
path: semantic_parsing_spider_zero_shot/default-*
- split: train
path: semantic_parsing_spider_zero_shot/train-*
- split: validation
path: semantic_parsing_spider_zero_shot/validation-*
- config_name: sentence_ambiguity_zero_shot
data_files:
- split: default
path: sentence_ambiguity_zero_shot/default-*
- split: train
path: sentence_ambiguity_zero_shot/train-*
- split: validation
path: sentence_ambiguity_zero_shot/validation-*
- config_name: similarities_abstraction_zero_shot
data_files:
- split: default
path: similarities_abstraction_zero_shot/default-*
- split: train
path: similarities_abstraction_zero_shot/train-*
- split: validation
path: similarities_abstraction_zero_shot/validation-*
- config_name: simp_turing_concept_zero_shot
data_files:
- split: default
path: simp_turing_concept_zero_shot/default-*
- split: train
path: simp_turing_concept_zero_shot/train-*
- split: validation
path: simp_turing_concept_zero_shot/validation-*
- config_name: simple_arithmetic_json_multiple_choice_zero_shot
data_files:
- split: default
path: simple_arithmetic_json_multiple_choice_zero_shot/default-*
- split: train
path: simple_arithmetic_json_multiple_choice_zero_shot/train-*
- split: validation
path: simple_arithmetic_json_multiple_choice_zero_shot/validation-*
- config_name: simple_arithmetic_json_subtasks_zero_shot
data_files:
- split: default
path: simple_arithmetic_json_subtasks_zero_shot/default-*
- split: train
path: simple_arithmetic_json_subtasks_zero_shot/train-*
- split: validation
path: simple_arithmetic_json_subtasks_zero_shot/validation-*
- config_name: simple_arithmetic_json_zero_shot
data_files:
- split: default
path: simple_arithmetic_json_zero_shot/default-*
- split: train
path: simple_arithmetic_json_zero_shot/train-*
- split: validation
path: simple_arithmetic_json_zero_shot/validation-*
- config_name: simple_arithmetic_multiple_targets_json_zero_shot
data_files:
- split: default
path: simple_arithmetic_multiple_targets_json_zero_shot/default-*
- split: train
path: simple_arithmetic_multiple_targets_json_zero_shot/train-*
- split: validation
path: simple_arithmetic_multiple_targets_json_zero_shot/validation-*
- config_name: simple_ethical_questions_zero_shot
data_files:
- split: default
path: simple_ethical_questions_zero_shot/default-*
- split: train
path: simple_ethical_questions_zero_shot/train-*
- split: validation
path: simple_ethical_questions_zero_shot/validation-*
- config_name: simple_text_editing_zero_shot
data_files:
- split: default
path: simple_text_editing_zero_shot/default-*
- split: train
path: simple_text_editing_zero_shot/train-*
- split: validation
path: simple_text_editing_zero_shot/validation-*
- config_name: snarks_zero_shot
data_files:
- split: default
path: snarks_zero_shot/default-*
- split: train
path: snarks_zero_shot/train-*
- split: validation
path: snarks_zero_shot/validation-*
- config_name: social_iqa_zero_shot
data_files:
- split: default
path: social_iqa_zero_shot/default-*
- split: train
path: social_iqa_zero_shot/train-*
- split: validation
path: social_iqa_zero_shot/validation-*
- config_name: social_support_zero_shot
data_files:
- split: default
path: social_support_zero_shot/default-*
- split: train
path: social_support_zero_shot/train-*
- split: validation
path: social_support_zero_shot/validation-*
- config_name: sports_understanding_zero_shot
data_files:
- split: default
path: sports_understanding_zero_shot/default-*
- split: train
path: sports_understanding_zero_shot/train-*
- split: validation
path: sports_understanding_zero_shot/validation-*
- config_name: strange_stories_zero_shot
data_files:
- split: default
path: strange_stories_zero_shot/default-*
- split: train
path: strange_stories_zero_shot/train-*
- split: validation
path: strange_stories_zero_shot/validation-*
- config_name: strategyqa_zero_shot
data_files:
- split: default
path: strategyqa_zero_shot/default-*
- split: train
path: strategyqa_zero_shot/train-*
- split: validation
path: strategyqa_zero_shot/validation-*
- config_name: sufficient_information_zero_shot
data_files:
- split: default
path: sufficient_information_zero_shot/default-*
- split: train
path: sufficient_information_zero_shot/train-*
- split: validation
path: sufficient_information_zero_shot/validation-*
- config_name: suicide_risk_zero_shot
data_files:
- split: default
path: suicide_risk_zero_shot/default-*
- split: train
path: suicide_risk_zero_shot/train-*
- split: validation
path: suicide_risk_zero_shot/validation-*
- config_name: swahili_english_proverbs_zero_shot
data_files:
- split: default
path: swahili_english_proverbs_zero_shot/default-*
- split: train
path: swahili_english_proverbs_zero_shot/train-*
- split: validation
path: swahili_english_proverbs_zero_shot/validation-*
- config_name: swedish_to_german_proverbs_zero_shot
data_files:
- split: default
path: swedish_to_german_proverbs_zero_shot/default-*
- split: train
path: swedish_to_german_proverbs_zero_shot/train-*
- split: validation
path: swedish_to_german_proverbs_zero_shot/validation-*
- config_name: symbol_interpretation_zero_shot
data_files:
- split: default
path: symbol_interpretation_zero_shot/default-*
- split: train
path: symbol_interpretation_zero_shot/train-*
- split: validation
path: symbol_interpretation_zero_shot/validation-*
- config_name: temporal_sequences_zero_shot
data_files:
- split: default
path: temporal_sequences_zero_shot/default-*
- split: train
path: temporal_sequences_zero_shot/train-*
- split: validation
path: temporal_sequences_zero_shot/validation-*
- config_name: tense_zero_shot
data_files:
- split: default
path: tense_zero_shot/default-*
- split: train
path: tense_zero_shot/train-*
- split: validation
path: tense_zero_shot/validation-*
- config_name: timedial_zero_shot
data_files:
- split: default
path: timedial_zero_shot/default-*
- split: train
path: timedial_zero_shot/train-*
- split: validation
path: timedial_zero_shot/validation-*
- config_name: topical_chat_zero_shot
data_files:
- split: default
path: topical_chat_zero_shot/default-*
- split: train
path: topical_chat_zero_shot/train-*
- split: validation
path: topical_chat_zero_shot/validation-*
- config_name: tracking_shuffled_objects_zero_shot
data_files:
- split: default
path: tracking_shuffled_objects_zero_shot/default-*
- split: train
path: tracking_shuffled_objects_zero_shot/train-*
- split: validation
path: tracking_shuffled_objects_zero_shot/validation-*
- config_name: understanding_fables_zero_shot
data_files:
- split: default
path: understanding_fables_zero_shot/default-*
- split: train
path: understanding_fables_zero_shot/train-*
- split: validation
path: understanding_fables_zero_shot/validation-*
- config_name: undo_permutation_zero_shot
data_files:
- split: default
path: undo_permutation_zero_shot/default-*
- split: train
path: undo_permutation_zero_shot/train-*
- split: validation
path: undo_permutation_zero_shot/validation-*
- config_name: unit_conversion_zero_shot
data_files:
- split: default
path: unit_conversion_zero_shot/default-*
- split: train
path: unit_conversion_zero_shot/train-*
- split: validation
path: unit_conversion_zero_shot/validation-*
- config_name: unit_interpretation_zero_shot
data_files:
- split: default
path: unit_interpretation_zero_shot/default-*
- split: train
path: unit_interpretation_zero_shot/train-*
- split: validation
path: unit_interpretation_zero_shot/validation-*
- config_name: unnatural_in_context_learning_zero_shot
data_files:
- split: default
path: unnatural_in_context_learning_zero_shot/default-*
- split: train
path: unnatural_in_context_learning_zero_shot/train-*
- split: validation
path: unnatural_in_context_learning_zero_shot/validation-*
- config_name: vitaminc_fact_verification_zero_shot
data_files:
- split: default
path: vitaminc_fact_verification_zero_shot/default-*
- split: train
path: vitaminc_fact_verification_zero_shot/train-*
- split: validation
path: vitaminc_fact_verification_zero_shot/validation-*
- config_name: what_is_the_tao_zero_shot
data_files:
- split: default
path: what_is_the_tao_zero_shot/default-*
- split: train
path: what_is_the_tao_zero_shot/train-*
- split: validation
path: what_is_the_tao_zero_shot/validation-*
- config_name: which_wiki_edit_zero_shot
data_files:
- split: default
path: which_wiki_edit_zero_shot/default-*
- split: train
path: which_wiki_edit_zero_shot/train-*
- split: validation
path: which_wiki_edit_zero_shot/validation-*
- config_name: winowhy_zero_shot
data_files:
- split: default
path: winowhy_zero_shot/default-*
- split: train
path: winowhy_zero_shot/train-*
- split: validation
path: winowhy_zero_shot/validation-*
- config_name: word_sorting_zero_shot
data_files:
- split: default
path: word_sorting_zero_shot/default-*
- split: train
path: word_sorting_zero_shot/train-*
- split: validation
path: word_sorting_zero_shot/validation-*
- config_name: word_unscrambling_zero_shot
data_files:
- split: default
path: word_unscrambling_zero_shot/default-*
- split: train
path: word_unscrambling_zero_shot/train-*
- split: validation
path: word_unscrambling_zero_shot/validation-*
---
# Dataset Card for "bigbench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-40k | ---
pretty_name: Evaluation run of JunchengXie/Mistral-7B-v0.1-gpt-4-40k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Mistral-7B-v0.1-gpt-4-40k](https://huggingface.co/JunchengXie/Mistral-7B-v0.1-gpt-4-40k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-40k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T18:02:15.085072](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-40k/blob/main/results_2024-03-13T18-02-15.085072.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6263136416795934,\n\
\ \"acc_stderr\": 0.032710888919037076,\n \"acc_norm\": 0.6322580832470024,\n\
\ \"acc_norm_stderr\": 0.03336246855030227,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5489012537136961,\n\
\ \"mc2_stderr\": 0.015307134993365014\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467325,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6150169288986258,\n\
\ \"acc_stderr\": 0.004855968578998724,\n \"acc_norm\": 0.8149770961959769,\n\
\ \"acc_norm_stderr\": 0.003875225369365732\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.0252798503974049,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.0252798503974049\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406215,\n \
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406215\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391534,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n\
\ \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n\
\ \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580214,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580214\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484375,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484375\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5489012537136961,\n\
\ \"mc2_stderr\": 0.015307134993365014\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637564\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \
\ \"acc_stderr\": 0.013442502402794302\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Mistral-7B-v0.1-gpt-4-40k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-02-15.085072.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-02-15.085072.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- '**/details_harness|winogrande|5_2024-03-13T18-02-15.085072.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T18-02-15.085072.parquet'
- config_name: results
data_files:
- split: 2024_03_13T18_02_15.085072
path:
- results_2024-03-13T18-02-15.085072.parquet
- split: latest
path:
- results_2024-03-13T18-02-15.085072.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Mistral-7B-v0.1-gpt-4-40k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Mistral-7B-v0.1-gpt-4-40k](https://huggingface.co/JunchengXie/Mistral-7B-v0.1-gpt-4-40k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-40k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-13T18:02:15.085072](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Mistral-7B-v0.1-gpt-4-40k/blob/main/results_2024-03-13T18-02-15.085072.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6263136416795934,
"acc_stderr": 0.032710888919037076,
"acc_norm": 0.6322580832470024,
"acc_norm_stderr": 0.03336246855030227,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5489012537136961,
"mc2_stderr": 0.015307134993365014
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467325,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6150169288986258,
"acc_stderr": 0.004855968578998724,
"acc_norm": 0.8149770961959769,
"acc_norm_stderr": 0.003875225369365732
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.0252798503974049,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.0252798503974049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406215,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406215
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391534,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580214,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5489012537136961,
"mc2_stderr": 0.015307134993365014
},
"harness|winogrande|5": {
"acc": 0.7379636937647988,
"acc_stderr": 0.012358944431637564
},
"harness|gsm8k|5": {
"acc": 0.3912054586808188,
"acc_stderr": 0.013442502402794302
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LIUM/tedlium | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license: []
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids: []
pretty_name: TED-LIUM
---
# Dataset Card for tedlium
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [TED-LIUM homepage](https://www.openslr.org/7/)
- **Repository:** [Needs More Information]
- **Paper:** [TED-LIUM: an Automatic Speech Recognition dedicated corpus](https://aclanthology.org/L12-1405/)
- **Leaderboard:** [Paperswithcode Leaderboard](https://paperswithcode.com/sota/speech-recognition-on-tedlium)
- **Point of Contact:** [Sanchit Gandhi](mailto:sanchit@huggingface.co)
### Dataset Summary
The TED-LIUM corpus consists of English-language TED talks with transcriptions, sampled at 16 kHz. The three releases of the corpus range from 118 to 452 hours of transcribed speech data.
### Example
```python
from datasets import load_dataset
tedlium = load_dataset("LIUM/tedlium", "release1") # for Release 1
# see structure
print(tedlium)
# load audio sample on the fly
audio_input = tedlium["train"][0]["audio"] # first decoded audio sample
transcription = tedlium["train"][0]["text"] # first transcription
```
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active leaderboard, https://paperswithcode.com/sota/speech-recognition-on-tedlium, which ranks models based on their WER.
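To make the metric concrete, here is a minimal, self-contained WER sketch: the word-level Levenshtein distance between reference and hypothesis, normalized by the reference length. This is only an illustration; published results typically rely on a dedicated library such as `jiwer` rather than a hand-rolled implementation.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# One deleted word ("the") out of six reference words -> WER of 1/6
print(wer("the cat sat on the mat", "the cat sat on mat"))
```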
### Languages
The audio and transcriptions are in English, as per the TED talks at http://www.ted.com.
## Dataset Structure
### Data Instances
```
{'audio': {'path': '/home/sanchitgandhi/cache/downloads/extracted/6e3655f9e735ae3c467deed1df788e0dabd671c1f3e2e386e30aa3b571bd9761/TEDLIUM_release1/train/sph/PaulaScher_2008P.sph',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346,
0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000},
'text': '{COUGH} but <sil> i was so {COUGH} utterly unqualified for(2) this project and {NOISE} so utterly ridiculous {SMACK} and ignored the brief {SMACK} <sil>',
'speaker_id': 'PaulaScher_2008P',
'gender': 'female',
'file': '/home/sanchitgandhi/cache/downloads/extracted/6e3655f9e735ae3c467deed1df788e0dabd671c1f3e2e386e30aa3b571bd9761/TEDLIUM_release1/train/sph/PaulaScher_2008P.sph',
'id': 'PaulaScher_2008P-1003.35-1011.16-<o,f0,female>'}
```
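As the instance above shows, transcriptions carry annotation tokens such as `{COUGH}`-style noise events, `<sil>`-style tags, and pronunciation-variant suffixes like `for(2)`. A cleanup sketch follows; note the three marker conventions are inferred from this one sample, not from a format specification, so the patterns may need adjusting for your release.

```python
import re

def clean_tedlium_text(text: str) -> str:
    """Strip TED-LIUM annotation tokens, keeping only the spoken words."""
    text = re.sub(r"\{[^}]*\}", " ", text)  # noise events: {COUGH}, {SMACK}, ...
    text = re.sub(r"<[^>]*>", " ", text)    # tags: <sil>, ...
    text = re.sub(r"\(\d+\)", "", text)     # variant suffixes: for(2) -> for
    return " ".join(text.split())           # collapse runs of whitespace

sample = ("{COUGH} but <sil> i was so {COUGH} utterly unqualified for(2) "
          "this project and {NOISE} so utterly ridiculous {SMACK} and "
          "ignored the brief {SMACK} <sil>")
print(clean_tedlium_text(sample))
# but i was so utterly unqualified for this project and so utterly ridiculous and ignored the brief
```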
### Data Fields
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- file: A path to the downloaded audio file in .sph format.
- text: the transcription of the audio file.
- gender: the gender of the speaker. One of: male, female or N/A.
- id: unique id of the data sample.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
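The `id` in the instance above appears to concatenate the speaker id, the segment start and end times in seconds, and a metadata tag. Assuming that format holds generally (an inference from one example, not a documented guarantee), per-segment durations can be recovered like this:

```python
import re

# Assumed layout "<speaker>-<start>-<end>-<tag>", inferred from the sample id.
ID_PATTERN = re.compile(r"^(?P<speaker>.+)-(?P<start>[\d.]+)-(?P<end>[\d.]+)-(?P<tag><.+>)$")

def parse_segment_id(segment_id: str) -> dict:
    """Split a TED-LIUM segment id into speaker, timing, and tag fields."""
    m = ID_PATTERN.match(segment_id)
    if m is None:
        raise ValueError(f"unrecognized id format: {segment_id!r}")
    start, end = float(m.group("start")), float(m.group("end"))
    return {
        "speaker_id": m.group("speaker"),
        "start": start,
        "end": end,
        "duration": end - start,
        "tag": m.group("tag"),
    }

seg = parse_segment_id("PaulaScher_2008P-1003.35-1011.16-<o,f0,female>")
print(seg["speaker_id"], round(seg["duration"], 2))
```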
### Data Splits
There are three releases for the TED-LIUM corpus, progressively increasing the number of transcribed speech training data from 118 hours (Release 1), to 207 hours (Release 2), to 452 hours (Release 3).
Release 1:
- 774 audio talks and automatically aligned transcriptions.
- Contains 118 hours of speech audio data.
- Homepage: https://www.openslr.org/7/
Release 2:
- 1495 audio talks and automatically aligned transcriptions.
- Contains 207 hours of speech audio data.
- Dictionary with pronunciations (159848 entries).
- Selected monolingual data for language modeling from WMT12 publicly available corpora.
- Homepage: https://www.openslr.org/19/
Release 3:
- 2351 audio talks and automatically aligned transcriptions.
- Contains 452 hours of speech audio data.
- TED-LIUM 2 validation and test data: 19 TED talks with their corresponding manual transcriptions.
- Dictionary with pronunciations (159848 entries), the same file as the one included in TED-LIUM 2.
- Selected monolingual data for language modeling from WMT12 publicly available corpora: these files come from the TED-LIUM 2 release, but have been modified to produce a tokenization more relevant for the English language.
- Homepage: https://www.openslr.org/51/
Release 3 contains two different corpus distributions:
- The ‘legacy’ one, on which the dev and test datasets are the same as in TED-LIUM 2 (and TED-LIUM 1).
- The ‘speaker adaptation’ one, specially designed for experiments on speaker adaptation.
Each release is split into a training, validation and test set:
| Split | Release 1 | Release 2 | Release 3 |
|------------|-----------|-----------|-----------|
| Train | 56,803 | 92,973 | 268,263 |
| Validation | 591 | 591 | 591 |
| Test | 1,469 | 1,469 | 1,469 |
## Dataset Creation
### Curation Rationale
TED-LIUM was built during [The International Workshop on Spoken Language Translation (IWSLT) 2011 Evaluation Campaign](https://aclanthology.org/2011.iwslt-evaluation.1/), an annual workshop focused on the automatic translation of public talks that included tracks for speech recognition, speech translation, text translation, and system combination.
### Source Data
#### Initial Data Collection and Normalization
The data was obtained from publicly available TED talks at http://www.ted.com. Proper alignments between the speech and the transcribed text were generated using an in-house speaker segmentation and clustering tool (_LIUM_SpkDiarization_). Speech disfluencies (e.g. repetitions, hesitations, false starts) were treated in the following way: repetitions were transcribed, hesitations mapped to a specific filler word, and false starts not taken into account. For full details on the data collection and processing, refer to the [TED-LIUM paper](https://aclanthology.org/L12-1405/).
#### Who are the source language producers?
TED Talks are influential videos from expert speakers on education, business, science, tech and creativity.
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Licensed under Creative Commons BY-NC-ND 3.0 (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.en).
### Citation Information
Release 1:
```
@inproceedings{rousseau2012tedlium,
title={TED-LIUM: an Automatic Speech Recognition dedicated corpus},
author={Rousseau, Anthony and Del{\'e}glise, Paul and Est{\`e}ve, Yannick},
booktitle={Conference on Language Resources and Evaluation (LREC)},
pages={125--129},
year={2012}
}
```
Release 2:
```
@inproceedings{rousseau2014enhancing,
title={Enhancing the TED-LIUM corpus with selected data for language modeling and more TED talks.},
author={Rousseau, Anthony and Del{\'e}glise, Paul and Esteve, Yannick and others},
booktitle={LREC},
pages={3935--3939},
year={2014}
}
```
Release 3:
```
@inproceedings{hernandez2018ted,
author="Hernandez, Fran{\c{c}}ois
and Nguyen, Vincent
and Ghannay, Sahar
and Tomashenko, Natalia
and Est{\`e}ve, Yannick",
title="TED-LIUM 3: Twice as Much Data and Corpus Repartition for Experiments on Speaker Adaptation",
booktitle="Speech and Computer",
year="2018",
publisher="Springer International Publishing",
pages="198--208",
}
``` |
andersonaugustome/aime2 | ---
license: apache-2.0
---
|
Helsinki-NLP/eitb_parcc | ---
annotations_creators:
- found
language_creators:
- found
language:
- es
- eu
license: cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: eitb-parcc
pretty_name: EiTB-ParCC
dataset_info:
config_name: es-eu
features:
- name: translation
dtype:
translation:
languages:
- es
- eu
splits:
- name: train
num_bytes: 139038886
num_examples: 637183
download_size: 96930125
dataset_size: 139038886
configs:
- config_name: es-eu
data_files:
- split: train
path: es-eu/train-*
default: true
---
# Dataset Card for EiTB-ParCC
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://opus.nlpl.eu/EiTB-ParCC/corpus/version/EiTB-ParCC
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** https://aclanthology.org/2020.lrec-1.469/
- **Leaderboard:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
EiTB-ParCC: Parallel Corpus of Comparable News. A Basque-Spanish parallel corpus provided by
Vicomtech (https://www.vicomtech.org), extracted from comparable news produced by the
Basque public broadcasting group [Euskal Irrati Telebista](https://www.eitb.eus/).
### Supported Tasks and Leaderboards
Translation.
### Languages
The languages in the dataset are:
- Spanish (`es`)
- Basque (`eu`)
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The corpus is distributed under the Creative Commons BY-NC-SA 4.0 license.
### Citation Information
If you use any part of this corpus in your own work, please cite the following:
```
@inproceedings{etchegoyhen-gete-2020-handle,
title = "Handle with Care: A Case Study in Comparable Corpora Exploitation for Neural Machine Translation",
author = "Etchegoyhen, Thierry and
Gete, Harritxu",
editor = "Calzolari, Nicoletta and
B{\'e}chet, Fr{\'e}d{\'e}ric and
Blache, Philippe and
Choukri, Khalid and
Cieri, Christopher and
Declerck, Thierry and
Goggi, Sara and
Isahara, Hitoshi and
Maegaard, Bente and
Mariani, Joseph and
Mazo, H{\'e}l{\`e}ne and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Twelfth Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.469",
pages = "3799--3807",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
```
@InProceedings{TIEDEMANN12.463,
author = {J{\"o}rg Tiedemann},
title = {Parallel Data, Tools and Interfaces in OPUS},
booktitle = {Proceedings of the Eight International Conference on Language Resources and Evaluation (LREC'12)},
year = {2012},
month = {may},
date = {23-25},
address = {Istanbul, Turkey},
editor = {Nicoletta Calzolari (Conference Chair) and Khalid Choukri and Thierry Declerck and Mehmet Ugur Dogan and Bente Maegaard and Joseph Mariani and Jan Odijk and Stelios Piperidis},
publisher = {European Language Resources Association (ELRA)},
isbn = {978-2-9517408-7-7},
language = {english}
}
```
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ | ---
pretty_name: Evaluation run of TheBloke/Project-Baize-v2-7B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Project-Baize-v2-7B-GPTQ](https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T21:24:57.179060](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ/blob/main/results_2023-10-22T21-24-57.179060.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460633,\n \"f1\": 0.05739828020134247,\n\
\ \"f1_stderr\": 0.001324280220685328,\n \"acc\": 0.36097040821028104,\n\
\ \"acc_stderr\": 0.00860938625459939\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460633,\n\
\ \"f1\": 0.05739828020134247,\n \"f1_stderr\": 0.001324280220685328\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \
\ \"acc_stderr\": 0.0043020450465643045\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634475\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T21_24_57.179060
path:
- '**/details_harness|drop|3_2023-10-22T21-24-57.179060.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T21-24-57.179060.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T21_24_57.179060
path:
- '**/details_harness|gsm8k|5_2023-10-22T21-24-57.179060.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T21-24-57.179060.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T21_24_57.179060
path:
- '**/details_harness|winogrande|5_2023-10-22T21-24-57.179060.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T21-24-57.179060.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- results_2023-08-29T19:38:18.380876.parquet
- split: 2023_10_22T21_24_57.179060
path:
- results_2023-10-22T21-24-57.179060.parquet
- split: latest
path:
- results_2023-10-22T21-24-57.179060.parquet
---
# Dataset Card for Evaluation run of TheBloke/Project-Baize-v2-7B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Project-Baize-v2-7B-GPTQ](https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ",
"harness_winogrande_5",
split="train")
```
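Since each run is stored as a timestamped split and `latest` is simply an alias for the newest one, the newest split can also be resolved by hand — a minimal sketch, assuming split names follow the `YYYY_MM_DDTHH_MM_SS.ffffff` pattern used in this card's config listing:

```python
# Split names taken from this card's config listing; the timestamp format
# sorts lexicographically in chronological order, so a plain string max()
# is enough to find the newest run.
splits = [
    "2023_08_29T19_38_18.380876",
    "2023_10_22T21_24_57.179060",
]

latest = max(splits)
print(latest)  # 2023_10_22T21_24_57.179060
```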
## Latest results
These are the [latest results from run 2023-10-22T21:24:57.179060](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ/blob/main/results_2023-10-22T21-24-57.179060.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460633,
"f1": 0.05739828020134247,
"f1_stderr": 0.001324280220685328,
"acc": 0.36097040821028104,
"acc_stderr": 0.00860938625459939
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460633,
"f1": 0.05739828020134247,
"f1_stderr": 0.001324280220685328
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.0043020450465643045
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634475
}
}
```
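For this run, the top-level `acc` in the `all` block matches the unweighted mean of the two accuracy tasks — a quick sanity check on the numbers above (an illustration only, not part of the leaderboard tooling):

```python
# Per-task accuracies copied from the results above.
gsm8k_acc = 0.025018953752843062
winogrande_acc = 0.696921862667719

# The "all.acc" value reported above is their unweighted mean.
all_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(all_acc - 0.36097040821028104) < 1e-12
```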
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
michaelmallari/airbnb-usa-tn-nashville | ---
license: mit
---
|
mesolitica/chatgpt-kg-triplets | ---
language:
- ms
pretty_name: malay-kg-triplets
---
# Knowledge Graph Triplet format
Generated using ChatGPT 3.5 on:
1. Astroawani news, https://github.com/mesolitica/malaysian-dataset/tree/master/knowledge-graph/chatgpt-astroawani, [kg-astroawani.translated.jsonl](kg-astroawani.translated.jsonl), 9162 rows, 125 MB
2. MS Wikipedia, https://github.com/mesolitica/malaysian-dataset/tree/master/knowledge-graph/chatgpt-wikipedia, [kg-paragraph-wikipedia.translated.jsonl](kg-paragraph-wikipedia.translated.jsonl), 25032 rows, 166 MB
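Each record stores triplets in two shapes: the `*_kg` fields use `subject`/`predicate`/`object` dictionaries under a `triplets` key, while the translated `*_kg_ms` fields use flat `head`/`type`/`tail` records. A minimal helper to map between the two shapes (an illustration only, not part of the dataset tooling):

```python
def to_head_type_tail(kg):
    """Flatten a {'triplets': [...]} record into head/type/tail dictionaries."""
    return [
        {"head": t["subject"], "type": t["predicate"], "tail": t["object"]}
        for t in kg["triplets"]
    ]

# A description_kg record as it appears in the dataset.
description_kg = {
    "triplets": [
        {
            "subject": "CEO",
            "predicate": "tidak boleh menjalin hubungan dengan",
            "object": "kakitangan",
        }
    ]
}

print(to_head_type_tail(description_kg))
```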
## Example data
```json
{'id': 221733,
'title': "Padah jalin hubungan sulit dengan pekerja sendiri, CEO McDonald's dipecat serta merta",
'description': 'CEO tidak boleh menjalin hubungan dengan mana-mana kakitangan.',
'body': ["SYARIKAT rantaian makanan segera terkemuka dunia, McDonald's Corp mengesahkan telah memecat Ketua Pegawai Eksekutif (CEO), Steve Easterbrook selepas menjalinkan hubungan sulit dengan salah seorang kakitangannya.",
"Menurut McDonald's dalam satu kenyataan, tindakan tersebut diambil berikutan Easterbrook, 52, didakwa melanggar polisi syarikat, yang tidak membenarkan CEO mempunyai hubungan dengan mana-mana kakitangan syarikat.",
"Susulan pemecatan tersebut, restoran terbesar dunia itu melantik bekas presiden McDonald's Amerika Syarikat (AS), Chris Kempczinski, sebagai CEO baharu berkuat kuasa serta-merta.",
'Sementara itu, Easterbrook menerusi emel kepada kakitangannya mengakui hubungan tersebut merupakan "satu kesilapan" yang bertentangan dengan dasar syarikat.',
'"Mengambil nilai syarikat ini, saya bersetuju untuk mengundurkan diri," demikian katanya.',
"Easterbrook pernah bercerai dan memulakan kerjaya dengan McDonald's pada tahun 1993 sebagai pengurus di London sebelum dinaikkan pangkat.",
"Beliau dilantik sebagai CEO McDonald's Corporation pada tahun 2015. -"],
'title_kg': {'triplets': [{'subject': 'Padah',
'predicate': 'memiliki',
'object': 'hubungan sulit'},
{'subject': 'hubungan sulit',
'predicate': 'dengan',
'object': 'pekerja sendiri'},
{'subject': 'Padah', 'predicate': 'dipecat', 'object': "CEO McDonald's"}]},
'description_kg': {'triplets': [{'subject': 'CEO',
'predicate': 'tidak boleh menjalin hubungan dengan',
'object': 'kakitangan'}]},
'body_kg': [["SYARIKAT rantaian makanan segera terkemuka dunia, McDonald's Corp mengesahkan telah memecat Ketua Pegawai Eksekutif (CEO), Steve Easterbrook selepas menjalinkan hubungan sulit dengan salah seorang kakitangannya.",
{'triplets': [{'subject': "McDonald's Corp",
'predicate': 'is a',
'object': "world's leading fast food chain company"},
{'subject': "McDonald's Corp",
'predicate': 'confirmed',
'object': 'firing CEO Steve Easterbrook'},
{'subject': 'Steve Easterbrook',
'predicate': 'had',
'object': 'an inappropriate relationship with an employee'}]}],
["Menurut McDonald's dalam satu kenyataan, tindakan tersebut diambil berikutan Easterbrook, 52, didakwa melanggar polisi syarikat, yang tidak membenarkan CEO mempunyai hubungan dengan mana-mana kakitangan syarikat.",
{'triplets': [{'subject': "McDonald's",
'predicate': 'statement',
'object': 'Tindakan diambil berikutan Easterbrook didakwa melanggar polisi syarikat yang tidak membenarkan CEO mempunyai hubungan dengan mana-mana kakitangan syarikat.'}]}],
["Susulan pemecatan tersebut, restoran terbesar dunia itu melantik bekas presiden McDonald's Amerika Syarikat (AS), Chris Kempczinski, sebagai CEO baharu berkuat kuasa serta-merta.",
{'triplets': [{'subject': 'restoran terbesar dunia',
'predicate': 'melantik',
'object': 'Chris Kempczinski'},
{'subject': 'restoran terbesar dunia',
'predicate': 'sebagai',
'object': 'CEO'},
{'subject': 'restoran terbesar dunia',
'predicate': 'berkuat kuasa',
'object': 'serta-merta'}]}],
['Sementara itu, Easterbrook menerusi emel kepada kakitangannya mengakui hubungan tersebut merupakan "satu kesilapan" yang bertentangan dengan dasar syarikat.',
{'triplets': [{'subject': 'Easterbrook',
'predicate': 'admits',
'object': 'relationship'},
{'subject': 'relationship', 'predicate': 'is', 'object': 'mistake'},
{'subject': 'relationship',
'predicate': 'contradicts',
'object': 'company policy'}]}],
['"Mengambil nilai syarikat ini, saya bersetuju untuk mengundurkan diri," demikian katanya.',
{'triplets': [{'subject': 'saya',
'predicate': 'mengambil',
'object': 'nilai syarikat ini'},
{'subject': 'saya',
'predicate': 'bersetuju',
'object': 'mengundurkan diri'}]}],
["Easterbrook pernah bercerai dan memulakan kerjaya dengan McDonald's pada tahun 1993 sebagai pengurus di London sebelum dinaikkan pangkat.",
{'triplets': [{'subject': 'Easterbrook',
'predicate': 'bercerai',
'object': 'true'},
{'subject': 'Easterbrook',
'predicate': 'memulakan kerjaya',
'object': "McDonald's"},
{'subject': 'Easterbrook', 'predicate': 'tahun', 'object': '1993'},
{'subject': 'Easterbrook', 'predicate': 'pengurus', 'object': 'London'},
{'subject': 'Easterbrook',
'predicate': 'dinaikkan pangkat',
'object': 'true'}]}],
["Beliau dilantik sebagai CEO McDonald's Corporation pada tahun 2015. -",
{'triplets': [{'subject': 'Beliau',
'predicate': 'dilantik sebagai',
'object': "CEO McDonald's Corporation"},
{'subject': 'Beliau', 'predicate': 'pada tahun', 'object': '2015'}]}]],
'title_kg_ms': [{'head': 'Padah',
'type': 'mempunyai',
'tail': 'hubungan sulit'},
{'head': 'hubungan sulit', 'type': 'dengan', 'tail': 'pekerja sendiri'},
{'head': 'Padah', 'type': 'dipecat', 'tail': "CEO McDonald's"}],
'description_kg_ms': [{'head': 'CEO',
'type': 'tidak boleh menjalin hubungan dengan',
'tail': 'kakitangan'}],
'body_kg_ms': [["SYARIKAT rantaian makanan segera terkemuka dunia, McDonald's Corp mengesahkan telah memecat Ketua Pegawai Eksekutif (CEO), Steve Easterbrook selepas menjalinkan hubungan sulit dengan salah seorang kakitangannya.",
[{'head': '',
'type': 'mengesahkan',
'tail': 'yang telah memecat Steve Easterbrook'},
{'head': 'Steve Easterbrook',
'type': 'telah',
'tail': 'hubungan yang tidak sesuai dengan pekerja'}]],
["Menurut McDonald's dalam satu kenyataan, tindakan tersebut diambil berikutan Easterbrook, 52, didakwa melanggar polisi syarikat, yang tidak membenarkan CEO mempunyai hubungan dengan mana-mana kakitangan syarikat.",
[]],
["Susulan pemecatan tersebut, restoran terbesar dunia itu melantik bekas presiden McDonald's Amerika Syarikat (AS), Chris Kempczinski, sebagai CEO baharu berkuat kuasa serta-merta.",
[{'head': '', 'type': 'melantik', 'tail': 'Chris Kempczinski'},
{'head': '', 'type': 'sebagai', 'tail': 'CEO'}]],
['Sementara itu, Easterbrook menerusi emel kepada kakitangannya mengakui hubungan tersebut merupakan "satu kesilapan" yang bertentangan dengan dasar syarikat.',
[{'head': 'Easterbrook', 'type': 'mengakui', 'tail': 'hubungan'},
{'head': 'hubungan', 'type': 'ialah', 'tail': 'kesilapan'},
{'head': 'hubungan', 'type': 'bercanggah', 'tail': 'dasar syarikat'}]],
['"Mengambil nilai syarikat ini, saya bersetuju untuk mengundurkan diri," demikian katanya.',
[{'head': 'Saya', 'type': 'mengambil', 'tail': 'nilai syarikat ini'},
{'head': 'Saya', 'type': 'bersetuju', 'tail': 'meletak jawatan'}]],
["Easterbrook pernah bercerai dan memulakan kerjaya dengan McDonald's pada tahun 1993 sebagai pengurus di London sebelum dinaikkan pangkat.",
[{'head': 'Easterbrook', 'type': 'bercerai', 'tail': 'benar'},
{'head': 'Easterbrook', 'type': 'memulakan kerjaya', 'tail': "McDonald's"},
{'head': 'Easterbrook', 'type': 'tahun', 'tail': '1993'},
{'head': 'Easterbrook', 'type': 'pengurus', 'tail': 'London'},
{'head': 'Easterbrook', 'type': 'dinaikkan pangkat', 'tail': 'benar'}]],
["Beliau dilantik sebagai CEO McDonald's Corporation pada tahun 2015. -",
[{'head': "Beliau adalah CEO McDonald's Corporation",
'type': 'pada tahun',
'tail': '2015'}]]]}
``` |
CyberHarem/px4_storm_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of px4_storm/Px4ストーム/Px4风暴 (Girls' Frontline)
This is the dataset of px4_storm/Px4ストーム/Px4风暴 (Girls' Frontline), containing 31 images and their tags.
The core tags of this character are `green_eyes, blonde_hair, breasts, bangs, large_breasts, mole_under_eye, mole, short_hair, hair_between_eyes, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 40.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 22.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 80 | 48.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 35.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 80 | 68.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/px4_storm_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, gloves, solo, hood_up, blush, looking_at_viewer, white_background, dress, character_name, handgun, black_coat, holding_gun, skindentation, thigh_strap, thighs |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, looking_at_viewer, navel, solo, white_bikini, cleavage, collarbone, hairclip, halterneck, simple_background, white_background, bare_legs, closed_mouth, feet, full_body, holding, o-ring_bikini, orange_hair, parted_lips, sandals, sarong, sky, smile, standing, stomach, thighs, toes, wet, white_footwear |
| 2 | 7 |  |  |  |  |  | 1girl, blush, red_sweater, smile, looking_at_viewer, solo, turtleneck, black_pantyhose, beret, earrings, necklace, panties, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | solo | hood_up | blush | looking_at_viewer | white_background | dress | character_name | handgun | black_coat | holding_gun | skindentation | thigh_strap | thighs | bare_shoulders | navel | white_bikini | cleavage | collarbone | hairclip | halterneck | simple_background | bare_legs | closed_mouth | feet | full_body | holding | o-ring_bikini | orange_hair | parted_lips | sandals | sarong | sky | smile | standing | stomach | toes | wet | white_footwear | red_sweater | turtleneck | black_pantyhose | beret | earrings | necklace | panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:----------|:--------|:--------------------|:-------------------|:--------|:-----------------|:----------|:-------------|:--------------|:----------------|:--------------|:---------|:-----------------|:--------|:---------------|:-----------|:-------------|:-----------|:-------------|:--------------------|:------------|:---------------|:-------|:------------|:----------|:----------------|:--------------|:--------------|:----------|:---------|:------|:--------|:-----------|:----------|:-------|:------|:-----------------|:--------------|:-------------|:------------------|:--------|:-----------|:-----------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X |
|
iambestfeed/vnexpress_hard_negative | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: hard_negative_bm25
sequence: string
splits:
- name: train
num_bytes: 3289748209
num_examples: 67785
download_size: 1736973192
dataset_size: 3289748209
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DFKI-SLT/wikitext_linked | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: wikitext_linked
size_categories:
- 1M<n<10M
source_datasets:
- extended|wikitext
task_categories:
- fill-mask
- token-classification
- text-classification
task_ids:
- masked-language-modeling
- named-entity-recognition
- part-of-speech
- lemmatization
- parsing
- entity-linking-classification
---
# Dataset Card for wikitext_linked
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** -
- **Repository:** [https://github.com/GabrielKP/svo/](https://github.com/GabrielKP/svo/)
- **Paper:** -
- **Leaderboard:** -
- **Point of Contact:** [gabriel.kressin@dfki.de](mailto:gabriel.kressin@dfki.de)
### Dataset Summary
The WikiText language modeling dataset is a collection of over 100 million tokens extracted from
the set of verified Good and Featured articles on Wikipedia. Dependency relations, POS and NER tags
are annotated with [trankit](https://github.com/nlp-uoregon/trankit); entities are linked with
[entity-fishing](https://nerd.readthedocs.io/en/latest/index.html), which also contributes a second
field of NER tags. The dataset is available under the Creative Commons Attribution-ShareAlike License.
Compared to the preprocessed version of Penn Treebank (PTB), WikiText-2 is over 2 times larger and
WikiText-103 is over 110 times larger. The WikiText dataset also features a far larger vocabulary
and retains the original case, punctuation and numbers - all of which are removed in PTB. As it is
composed of full articles, the dataset is well suited for models that can take advantage of long
term dependencies.
### Supported Tasks and Leaderboards
- masked-language-modeling
- named-entity-recognition
- part-of-speech
- lemmatization
- parsing
- entity-linking-classification
### Languages
English.
## Dataset Structure
### Data Instances
#### wikitext2
- **Size of downloaded dataset files:** 27.3 MB
- **Size of the generated dataset:** 197.2 MB
- **Total amount of disk used:** 197.2 MB
An example of 'validation' looks as follows.
```json
{
'text': 'It is closely related to the American lobster , H. americanus .',
'original_id': 3,
'tok_span': [[0, 0], [0, 2], [3, 5], [6, 13], [14, 21], [22, 24], [25, 28], [29, 37], [38, 45], [46, 47], [48, 50], [51, 61], [62, 63]],
'tok_upos': ['root', 'PRON', 'AUX', 'ADV', 'ADJ', 'ADP', 'DET', 'ADJ', 'NOUN', 'PUNCT', 'PROPN', 'PROPN', 'PUNCT'],
'tok_xpos': ['root', 'PRP', 'VBZ', 'RB', 'JJ', 'IN', 'DT', 'JJ', 'NN', ',', 'NNP', 'NNP', '.'],
'tok_dephead': [0, 4, 4, 4, 0, 8, 8, 8, 4, 8, 8, 10, 4],
'tok_deprel': ['root', 'nsubj', 'cop', 'advmod', 'root', 'case', 'det', 'amod', 'obl', 'punct', 'appos', 'flat', 'punct'],
'tok_lemma': [None, 'it', 'be', 'closely', 'related', 'to', 'the', 'american', 'lobster', ',', 'H.', 'americanus', '.'],
'tok_ner': [None, 'O', 'O', 'O', 'O', 'O', 'O', 'S-MISC', 'O', 'O', 'O', 'O', 'O'],
'ent_span': [[29, 45]],
'ent_wikipedia_external_ref': ['377397'],
'ent_ner': [None],
'ent_domains': [['Enterprise']],
}
```
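`ent_span` slices the entity mention directly out of `text`, and `ent_wikipedia_external_ref` holds Wikipedia page ids. A small sketch using the instance above (the `?curid=` URL scheme is an assumption about how to resolve a page id, not something the dataset prescribes):

```python
example = {
    "text": "It is closely related to the American lobster , H. americanus .",
    "ent_span": [[29, 45]],
    "ent_wikipedia_external_ref": ["377397"],
}

for (start, end), page_id in zip(
    example["ent_span"], example["ent_wikipedia_external_ref"]
):
    mention = example["text"][start:end]  # 'American lobster'
    # Assumed URL scheme for resolving a Wikipedia page id.
    url = f"https://en.wikipedia.org/?curid={page_id}"
    print(mention, "->", url)
```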
#### wikitext103
- **Size of downloaded dataset files:** 1.11 GB
- **Size of the generated dataset:** 7.82 GB
- **Total amount of disk used:** 7.82 GB
An example of 'train' looks as follows.
```json
{
'text': 'Vision for the PlayStation Portable .',
'original_id': 3,
'tok_span': [[0, 0], [0, 6], [7, 10], [11, 14], [15, 26], [27, 35], [36, 37]],
'tok_upos': ['root', 'NOUN', 'ADP', 'DET', 'PROPN', 'PROPN', 'PUNCT'],
'tok_xpos': ['root', 'NN', 'IN', 'DT', 'NNP', 'NNP', '.'],
'tok_dephead': [0, 0, 5, 5, 5, 1, 1],
'tok_deprel': ['root', 'root', 'case', 'det', 'compound', 'nmod', 'punct'],
'tok_lemma': [None, 'vision', 'for', 'the', 'PlayStation', 'Portable', '.'],
'tok_ner': [None, 'O', 'O', 'O', 'B-MISC', 'E-MISC', 'O'],
'ent_span': [[15, 35]],
'ent_wikipedia_external_ref': ['619009'],
'ent_ner': [None],
'ent_domains': [['Electronics', 'Computer_Science']]
}
```
Use the following code to print the examples nicely:
```py
def print_tokens_entities(example):
text = example['text']
print(
"Text:\n"
f" {text}"
"\nOrig-Id: "
f"{example['original_id']}"
"\nTokens:"
)
iterator = enumerate(zip(
example["tok_span"],
example["tok_upos"],
example["tok_xpos"],
example["tok_ner"],
example["tok_dephead"],
example["tok_deprel"],
example["tok_lemma"],
))
print(f" Id | {'token':12} | {'upos':8} | {'xpos':8} | {'ner':8} | {'deph':4} | {'deprel':9} | {'lemma':12} | Id")
print("---------------------------------------------------------------------------------------------------")
for idx, (tok_span, upos, xpos, ner, dephead, deprel, lemma) in iterator:
print(f" {idx:3} | {text[tok_span[0]:tok_span[1]]:12} | {upos:8} | {xpos:8} | {str(ner):8} | {str(dephead):4} | {deprel:9} | {str(lemma):12} | {idx}")
iterator = list(enumerate(zip(
example.get("ent_span", []),
example.get("ent_wikipedia_external_ref", []),
example.get("ent_ner", []),
example.get("ent_domains", []),
)))
if len(iterator) > 0:
print("Entities")
print(f" Id | {'entity':21} | {'wiki_ref':7} | {'ner':7} | domains")
print("--------------------------------------------------------------------")
for idx, ((start, end), wiki_ref, ent_ner, ent_domains) in iterator:
print(f" {idx:3} | {text[start:end]:21} | {str(wiki_ref):7} | {str(ent_ner):7} | {ent_domains}")
```
### Data Fields
The data fields are the same among all splits.
* text: string feature.
* original_id: int feature. Mapping to index within original wikitext dataset.
* tok_span: sequence of (int, int) tuples. Denotes token spans (start inclusive, end exclusive)
within each sentence.
**Note that each sentence includes an artificial root node to align dependency relations.**
* tok_upos: string feature. [Universal Dependency POS tags](https://universaldependencies.org/).
Aligned with tok_span. Root node has tag "root".
* tok_xpos: string feature. [XPOS POS tag](https://trankit.readthedocs.io/en/latest/overview.html#token-list).
Aligned with tok_span. Root node has tag "root".
* tok_dephead: int feature.
[Universal Dependency Head Node](https://universaldependencies.org/introduction.html). Int refers
to tokens in tok_span. Root node has head `0` (itself).
* tok_deprel: [Universal Dependency Relation Description](https://universaldependencies.org/introduction.html).
Refers to the relation between this token and head token. Aligned with tok_span. Root node has
dependency relation "root" to itself.
* tok_lemma: string feature. Lemma of the token. Aligned with tok_span. Root node has lemma `None`.
* tok_ner: string feature. NER tag of the token, in the BIOES scheme (e.g. S-MISC, B-LOC, ...).
Aligned with tok_span. Root node has NER tag `None`.
* ent_span: sequence of (int, int) tuples. Denotes entities found by entity-fishing
(start inclusive, end exclusive).
* ent_wikipedia_external_ref: string feature. External reference to the Wikipedia page. You can
access the Wikipedia page via the url `https://en.wikipedia.org/wiki?curid=<ent_wikipedia_external_ref>`.
Each entity has either this field or the `ent_ner` field, but never both.
An empty field is denoted by the string `None`. Aligned with ent_span.
* ent_ner: string feature. Denotes NER tags. An empty field is denoted by the string `None`.
Aligned with ent_span.
"ent_domains": sequence of string. Denotes domains of entity. Can be empty sequence. Aligned with
ent_span.
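The span conventions above can be checked directly against the wikitext2 validation example shown
earlier. This is a minimal, self-contained sketch; the dict is an abridged copy of that example:

```python
# Abridged copy of the wikitext2 validation example from this card.
example = {
    "text": "It is closely related to the American lobster , H. americanus .",
    "tok_span": [[0, 0], [0, 2], [3, 5], [6, 13], [14, 21], [22, 24], [25, 28],
                 [29, 37], [38, 45], [46, 47], [48, 50], [51, 61], [62, 63]],
    "tok_dephead": [0, 4, 4, 4, 0, 8, 8, 8, 4, 8, 8, 10, 4],
    "ent_span": [[29, 45]],
    "ent_wikipedia_external_ref": ["377397"],
}

text = example["text"]

# Token spans are (start inclusive, end exclusive) character offsets;
# index 0 is the artificial root node with the empty span [0, 0].
tokens = [text[s:e] for s, e in example["tok_span"]]

# tok_dephead points into tok_span: the head of "lobster" (index 8) is
# "related" (index 4); the root node has head 0 (itself).
head_of_lobster = tokens[example["tok_dephead"][8]]

# Entity spans use the same offset convention, and the external reference
# can be turned into a Wikipedia URL as described above.
(start, end) = example["ent_span"][0]
ref = example["ent_wikipedia_external_ref"][0]
entity = text[start:end]
url = f"https://en.wikipedia.org/wiki?curid={ref}"
print(entity, "<-", head_of_lobster, "|", url)
```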
### Data Splits
| name | train |validation| test|
|-------------------|------:|---------:|----:|
|wikitext103 |4076530| 8607|10062|
|wikitext2 | 82649| 8606|10062|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[https://huggingface.co/datasets/wikitext](https://huggingface.co/datasets/wikitext)
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
1. Started with `wikitext2-raw-v1` and `wikitext103-raw-v1` from [wikitext](https://huggingface.co/datasets/wikitext)
2. Ran datasets through Trankit. Marked all fields starting with `tok`.
In this step, the texts have been split into sentences. To retain the original text sections
you can accumulate over `original_id` (examples are in order).
3. Ran datasets through entity-fishing. Marked all fields starting with `ent`.
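Since the sentence-level examples are in order, accumulating over `original_id` (step 2) amounts
to grouping consecutive examples with the same id. A minimal sketch over a toy, made-up list of
examples (only the field names match the real dataset):

```python
from itertools import groupby

# Toy stand-in for an in-order stream of sentence-level examples;
# in practice these would come from iterating over the dataset splits.
examples = [
    {"original_id": 0, "text": "First sentence of section 0."},
    {"original_id": 0, "text": "Second sentence of section 0."},
    {"original_id": 1, "text": "Only sentence of section 1."},
]

# Examples are in order, so consecutive rows sharing an original_id
# belong to the same original wikitext section.
sections = {
    oid: " ".join(ex["text"] for ex in group)
    for oid, group in groupby(examples, key=lambda ex: ex["original_id"])
}
print(sections)
```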
#### Who are the annotators?
Machines powered by [DFKI](https://www.dfki.de/web).
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
### Citation Information
Please cite the original creators of wikitext, and the great people
developing trankit and entity-fishing.
```
@misc{merity2016pointer,
title={Pointer Sentinel Mixture Models},
author={Stephen Merity and Caiming Xiong and James Bradbury and Richard Socher},
year={2016},
eprint={1609.07843},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@inproceedings{nguyen2021trankit,
title={Trankit: A Light-Weight Transformer-based Toolkit for Multilingual Natural Language Processing},
author={Nguyen, Minh Van and Lai, Viet Dac and Veyseh, Amir Pouran Ben and Nguyen, Thien Huu},
    booktitle={Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations},
year={2021}
}
@misc{entity-fishing,
title = {entity-fishing},
  howpublished = {\url{https://github.com/kermitt2/entity-fishing}},
publisher = {GitHub},
year = {2016--2022},
archivePrefix = {swh},
eprint = {1:dir:cb0ba3379413db12b0018b7c3af8d0d2d864139c}
}
```
### Contributions
Thanks to [@GabrielKP](https://github.com/GabrielKP) for adding this dataset.
|
open-llm-leaderboard/details_Undi95__ReMM-SLERP-L2-13B | ---
pretty_name: Evaluation run of Undi95/ReMM-SLERP-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-SLERP-L2-13B](https://huggingface.co/Undi95/ReMM-SLERP-L2-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-SLERP-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T14:55:07.909290](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-SLERP-L2-13B/blob/main/results_2023-10-29T14-55-07.909290.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13464765100671142,\n\
\ \"em_stderr\": 0.0034957110748356193,\n \"f1\": 0.20755138422818709,\n\
\ \"f1_stderr\": 0.0036341951060626636,\n \"acc\": 0.421953322606337,\n\
\ \"acc_stderr\": 0.01004266408410234\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.13464765100671142,\n \"em_stderr\": 0.0034957110748356193,\n\
\ \"f1\": 0.20755138422818709,\n \"f1_stderr\": 0.0036341951060626636\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \
\ \"acc_stderr\": 0.00795094214833933\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-SLERP-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|arc:challenge|25_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T14_55_07.909290
path:
- '**/details_harness|drop|3_2023-10-29T14-55-07.909290.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T14-55-07.909290.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T14_55_07.909290
path:
- '**/details_harness|gsm8k|5_2023-10-29T14-55-07.909290.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T14-55-07.909290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hellaswag|10_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T13-42-48.770616.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-06T13-42-48.770616.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-06T13-42-48.770616.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T14_55_07.909290
path:
- '**/details_harness|winogrande|5_2023-10-29T14-55-07.909290.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T14-55-07.909290.parquet'
- config_name: results
data_files:
- split: 2023_09_06T13_42_48.770616
path:
- results_2023-09-06T13-42-48.770616.parquet
- split: 2023_10_29T14_55_07.909290
path:
- results_2023-10-29T14-55-07.909290.parquet
- split: latest
path:
- results_2023-10-29T14-55-07.909290.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-SLERP-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-SLERP-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-SLERP-L2-13B](https://huggingface.co/Undi95/ReMM-SLERP-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-SLERP-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T14:55:07.909290](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-SLERP-L2-13B/blob/main/results_2023-10-29T14-55-07.909290.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13464765100671142,
"em_stderr": 0.0034957110748356193,
"f1": 0.20755138422818709,
"f1_stderr": 0.0036341951060626636,
"acc": 0.421953322606337,
"acc_stderr": 0.01004266408410234
},
"harness|drop|3": {
"em": 0.13464765100671142,
"em_stderr": 0.0034957110748356193,
"f1": 0.20755138422818709,
"f1_stderr": 0.0036341951060626636
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.00795094214833933
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BeIR/webis-touche2020-qrels | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
# Sketch of the download-and-load flow documented in the BEIR repository
# (assumes the `beir` package is installed).
from beir import util
from beir.datasets.data_loader import GenericDataLoader

url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip"
data_path = util.download_and_unzip(url, "datasets")
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
```
### Supported Tasks and Leaderboards
The dataset supports zero-shot evaluation of information retrieval models across the nine task types listed above, with nDCG@10 as the primary metric reported in the BEIR paper.
The current best performing models can be found on the [BEIR leaderboard](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields: `_id` with a unique document identifier, `title` with the document title (optional) and `text` with the document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields: `_id` with a unique query identifier and `text` with the query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order; keep the first row as a header. For example: `q1 doc1 1`
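As a minimal sketch using only the standard library (the helper names are illustrative, not part of the BEIR toolkit), the three file formats described above can be parsed like this; the sample records are taken from the examples in this card:

```python
import csv
import io
import json

def load_jsonl(text):
    """Parse a BEIR-style .jsonl string into an {_id: record} mapping."""
    return {rec["_id"]: rec
            for rec in (json.loads(line) for line in text.strip().splitlines())}

def load_qrels(text):
    """Parse a BEIR-style qrels .tsv string (first row is a header)
    into {query-id: {corpus-id: score}}."""
    qrels = {}
    reader = csv.reader(io.StringIO(text), delimiter="\t")
    next(reader)  # skip the header row
    for query_id, corpus_id, score in reader:
        qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels

corpus = load_jsonl('{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}')
queries = load_jsonl('{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}')
qrels = load_qrels("query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n")
```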
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
        "text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
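With `corpus`, `queries` and `qrels` in this shape, a retrieval run can be scored. The sketch below computes recall@k as a toy illustration (the `recall_at_k` helper is ours, not a BEIR API; the official evaluation reports nDCG@10):

```python
def recall_at_k(qrels, results, k):
    """Mean fraction of relevant documents retrieved in the top-k.

    qrels:   {query_id: {doc_id: relevance}} as in the example above.
    results: {query_id: [doc_id, ...]} ranked best-first by the model.
    """
    per_query = []
    for query_id, relevant in qrels.items():
        top_k = set(results.get(query_id, [])[:k])
        hits = sum(1 for doc_id in relevant if doc_id in top_k)
        per_query.append(hits / len(relevant))
    return sum(per_query) / len(per_query)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
results = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}
print(recall_at_k(qrels, results, 1))  # 0.5 -- q2's doc2 only appears at rank 2
print(recall_at_k(qrels, results, 2))  # 1.0
```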
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |