| datasetId | card |
|---|---|
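Each row below pairs a `datasetId` with its raw card: a `---`-delimited YAML metadata block, optionally followed by a markdown body. A minimal sketch of splitting a card into those two parts (the helper name `split_card` is hypothetical, not part of any library):

```python
def split_card(card: str):
    """Split a raw dataset card into (yaml_frontmatter, markdown_body).

    Cards are expected to begin with a '---'-delimited YAML block;
    if none is found, the whole card is treated as body.
    """
    if card.startswith("---"):
        # Locate the closing '---' delimiter after the opening one.
        end = card.find("\n---", 3)
        if end != -1:
            frontmatter = card[3:end].strip("\n")
            body = card[end + len("\n---"):].lstrip("\n")
            return frontmatter, body
    # No well-formed frontmatter: treat the whole card as body.
    return "", card
```

The returned frontmatter string can then be handed to a YAML parser to recover fields such as `license` or `task_categories`.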
viola77data/recycling-dataset | ---
annotations_creators: []
language:
- en
language_creators:
- crowdsourced
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: recycling-dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- recycling
- image-classification
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
# Dataset Card for recycling-dataset
### Dataset Summary
This is a recycling dataset that can be used for image classification. It has 11 categories:
- aluminium
- batteries
- cardboard
- disposable plates
- glass
- hard plastic
- paper
- paper towel
- polystyrene
- soft plastics
- takeaway cups
It was scraped from DuckDuckGo using this tool: https://pypi.org/project/jmd-imagescraper/
|
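For multi-class image classification, the 11 category names above map naturally to integer labels; a minimal sketch (the label order shown is an assumption, not stated by the card):

```python
# The 11 recycling categories listed in the card; the ordering here
# is an assumption and may differ from the dataset's own label order.
CATEGORIES = [
    "aluminium", "batteries", "cardboard", "disposable plates", "glass",
    "hard plastic", "paper", "paper towel", "polystyrene", "soft plastics",
    "takeaway cups",
]

# Bidirectional mappings between category names and integer class ids.
LABEL2ID = {name: i for i, name in enumerate(CATEGORIES)}
ID2LABEL = {i: name for name, i in LABEL2ID.items()}
```

Such mappings are what classification heads and metrics typically consume instead of raw category strings.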
davidberenstein1957/stackoverflow_feedback_demo | ---
dataset_info:
features:
- name: metadata
dtype: string
- name: title
dtype: string
id: field
- name: question
dtype: string
id: field
- name: answer
dtype: string
id: field
- name: title_question_fit
sequence:
- name: user_id
dtype: string
- name: value
dtype: string
- name: status
dtype: string
id: question
- name: tags
sequence:
- name: user_id
dtype: string
- name: value
sequence: string
- name: status
dtype: string
id: question
- name: answer_quality
sequence:
- name: user_id
dtype: string
- name: value
dtype: int32
- name: status
dtype: string
id: question
- name: new_answer
sequence:
- name: user_id
dtype: string
- name: value
dtype: string
- name: status
dtype: string
id: question
- name: external_id
dtype: string
id: external_id
splits:
- name: train
num_bytes: 327656
num_examples: 200
download_size: 200001
dataset_size: 327656
---
# Dataset Card for "stackoverflow_feedback_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aaditya/alpaca_subset_2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 444296.5943617553
num_examples: 500
download_size: 234786
dataset_size: 444296.5943617553
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seongill/NQ_conflict_5_half | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: substitute
dtype: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: is_conflict
dtype: bool
- name: num_replace
dtype: int64
- name: num_answer
dtype: int64
splits:
- name: train
num_bytes: 12189834
num_examples: 3610
download_size: 7217475
dataset_size: 12189834
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yezhengli9/wmt20-fr-de | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 509387
num_examples: 1619
download_size: 281586
dataset_size: 509387
---
# Dataset Card for "wmt20-fr-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Farisya/ft-poc | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 91347
num_examples: 87
- name: test
num_bytes: 10667
num_examples: 10
download_size: 33780
dataset_size: 102014
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
iarejula/python-pb | ---
language:
- es
pretty_name: "Pretty Name of the Dataset"
tags:
- tag1
- tag2
license: "mit"
---
# Hello
|
OrdalieTech/Ordalie-FR-STS-benchmark | ---
language:
- fr
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- feature-extraction
pretty_name: ordalie-fr-sts-benchmark
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 14934570
num_examples: 10000
download_size: 9328832
dataset_size: 14934570
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Ordalie - French STS Benchmark
- 30k sentence pairs
- Scores are either 0 or 1 |
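Since each sentence pair in the benchmark above carries a binary 0/1 score, one way to evaluate an embedding model on it is to threshold the cosine similarity of the two sentence embeddings; a sketch with toy vectors (the threshold value is an assumption, and the embedding model itself is not specified by the card):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def predict_pair(u, v, threshold=0.5):
    """Predict the benchmark's binary label (0 or 1) from two embeddings."""
    return 1 if cosine(u, v) >= threshold else 0
```

Sweeping the threshold on a held-out split would be the usual way to pick its value rather than fixing 0.5.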
japanese-asr/whisper_transcriptions.reazonspeech.all_24 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30455876258.0
num_examples: 267716
download_size: 30213989898
dataset_size: 30455876258.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
Jayem-11/mozilla_commonvoice_hackathon_preprocessed_train_batch_1 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: input_length
dtype: int64
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
- name: labels_length
dtype: int64
splits:
- name: train
num_bytes: 15582750752.60228
num_examples: 13687
download_size: 4763193239
dataset_size: 15582750752.60228
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mozilla_commonvoice_hackathon_preprocessed_train_batch_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/aozaki_touko_karanokyoukai | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Aozaki Touko
This is a dataset of Aozaki Touko, containing 156 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...), and the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 156 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 338 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 156 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 156 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 156 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 156 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 156 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 338 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 338 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 338 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
AnkitSatpute/zbm_top1000_ttv_str | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4202798
num_examples: 136517
- name: test
num_bytes: 4248762
num_examples: 136566
- name: validation
num_bytes: 1665826
num_examples: 56172
download_size: 2838212
dataset_size: 10117386
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_present_for_exp_perfect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3411
num_examples: 25
- name: test
num_bytes: 8933
num_examples: 59
- name: train
num_bytes: 132207
num_examples: 1071
download_size: 74185
dataset_size: 144551
---
# Dataset Card for "MULTI_VALUE_sst2_present_for_exp_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/flickr_humans_20k_vangogh | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 11042860551.0
num_examples: 20000
download_size: 11043090672
dataset_size: 11042860551.0
---
# Dataset Card for "flickr_humans_20k_vangogh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-tak-stack-dpo | ---
pretty_name: Evaluation run of CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo](https://huggingface.co/CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-tak-stack-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T00:11:38.696466](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-tak-stack-dpo/blob/main/results_2024-03-01T00-11-38.696466.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6515965796991485,\n\
\ \"acc_stderr\": 0.03205501969270813,\n \"acc_norm\": 0.6510126737668477,\n\
\ \"acc_norm_stderr\": 0.032724191390382,\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7836542325930044,\n\
\ \"mc2_stderr\": 0.013659438976980564\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n\
\ \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7053375821549492,\n\
\ \"acc_stderr\": 0.004549591490046208,\n \"acc_norm\": 0.8872734515036845,\n\
\ \"acc_norm_stderr\": 0.0031561189647529367\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530333,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752597,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752597\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066309,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066309\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46145251396648046,\n\
\ \"acc_stderr\": 0.01667273126755226,\n \"acc_norm\": 0.46145251396648046,\n\
\ \"acc_norm_stderr\": 0.01667273126755226\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823694,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823694\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827058,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827058\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7836542325930044,\n\
\ \"mc2_stderr\": 0.013659438976980564\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292404\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \
\ \"acc_stderr\": 0.01257939823558952\n }\n}\n```"
repo_url: https://huggingface.co/CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-11-38.696466.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-11-38.696466.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- '**/details_harness|winogrande|5_2024-03-01T00-11-38.696466.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T00-11-38.696466.parquet'
- config_name: results
data_files:
- split: 2024_03_01T00_11_38.696466
path:
- results_2024-03-01T00-11-38.696466.parquet
- split: latest
path:
- results_2024-03-01T00-11-38.696466.parquet
---
# Dataset Card for Evaluation run of CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo](https://huggingface.co/CorticalStack/neurotic-crown-clown-7b-tak-stack-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-tak-stack-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-01T00:11:38.696466](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__neurotic-crown-clown-7b-tak-stack-dpo/blob/main/results_2024-03-01T00-11-38.696466.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6515965796991485,
"acc_stderr": 0.03205501969270813,
"acc_norm": 0.6510126737668477,
"acc_norm_stderr": 0.032724191390382,
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7836542325930044,
"mc2_stderr": 0.013659438976980564
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.01336308010724448,
"acc_norm": 0.7244027303754266,
"acc_norm_stderr": 0.01305716965576184
},
"harness|hellaswag|10": {
"acc": 0.7053375821549492,
"acc_stderr": 0.004549591490046208,
"acc_norm": 0.8872734515036845,
"acc_norm_stderr": 0.0031561189647529367
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530333,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752597,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752597
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066309,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066309
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46145251396648046,
"acc_stderr": 0.01667273126755226,
"acc_norm": 0.46145251396648046,
"acc_norm_stderr": 0.01667273126755226
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823694,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823694
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827058,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827058
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7836542325930044,
"mc2_stderr": 0.013659438976980564
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292404
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.01257939823558952
}
}
```
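As a sketch (function and variable names here are illustrative, not part of the harness output), the per-task `acc` values in the report above can be reduced to a single unweighted average once the JSON is loaded as a Python dict:

```python
import json

# A small excerpt of the per-task report above; in practice you would
# json.load() the full results file from this repository.
report_json = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
  "harness|winogrande|5": {"acc": 0.8382004735595896}
}
"""

def mean_accuracy(report: dict) -> float:
    """Unweighted mean of the 'acc' metric over every task that reports one."""
    accs = [scores["acc"] for scores in report.values() if "acc" in scores]
    return sum(accs) / len(accs)

report = json.loads(report_json)
print(round(mean_accuracy(report), 4))  # → 0.7388
```

Note this is a plain unweighted mean over tasks; the leaderboard's own aggregation may weight or group subtasks differently.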
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dog/fuego-20230215-094051-1a615e | ---
tags:
- fuego
fuego:
id: 20230215-094051-1a615e
status: done
script: run.py
requirements_file: requirements.txt
space_id: dog/actlearn-fuego-runner
space_hardware: cpu-basic
---
|
Multimodal-Fatima/Food101_5samples_class_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple pie
'1': baby back ribs
'2': baklava
'3': beef carpaccio
'4': beef tartare
'5': beet salad
'6': beignets
'7': bibimbap
'8': bread pudding
'9': breakfast burrito
'10': bruschetta
'11': caesar salad
'12': cannoli
'13': caprese salad
'14': carrot cake
'15': ceviche
'16': cheesecake
'17': cheese plate
'18': chicken curry
'19': chicken quesadilla
'20': chicken wings
'21': chocolate cake
'22': chocolate mousse
'23': churros
'24': clam chowder
'25': club sandwich
'26': crab cakes
'27': creme brulee
'28': croque madame
'29': cup cakes
'30': deviled eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs benedict
'35': escargots
'36': falafel
'37': filet mignon
'38': fish and chips
'39': foie gras
'40': french fries
'41': french onion soup
'42': french toast
'43': fried calamari
'44': fried rice
'45': frozen yogurt
'46': garlic bread
'47': gnocchi
'48': greek salad
'49': grilled cheese sandwich
'50': grilled salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot and sour soup
'55': hot dog
'56': huevos rancheros
'57': hummus
'58': ice cream
'59': lasagna
'60': lobster bisque
'61': lobster roll sandwich
'62': macaroni and cheese
'63': macarons
'64': miso soup
'65': mussels
'66': nachos
'67': omelette
'68': onion rings
'69': oysters
'70': pad thai
'71': paella
'72': pancakes
'73': panna cotta
'74': peking duck
'75': pho
'76': pizza
'77': pork chop
'78': poutine
'79': prime rib
'80': pulled pork sandwich
'81': ramen
'82': ravioli
'83': red velvet cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed salad
'89': shrimp and grits
'90': spaghetti bolognese
'91': spaghetti carbonara
'92': spring rolls
'93': steak
'94': strawberry shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna tartare
'100': waffles
- name: Attributes_ViT_L_14_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_food101
sequence: string
- name: clip_tags_ViT_L_14_with_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_wo_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_16_simple_specific
dtype: string
- name: clip_tags_ViT_B_16_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_32_simple_specific
dtype: string
- name: clip_tags_ViT_B_32_ensemble_specific
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_ViT_B_16_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
- name: id
dtype: int64
splits:
- name: test
num_bytes: 25787125.0
num_examples: 505
download_size: 24766110
dataset_size: 25787125.0
---
# Dataset Card for "Food101_5samples_class_test"
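A minimal, self-contained sketch of decoding the integer `label` field using the class-name table from the metadata above (only a handful of the 101 entries are reproduced here; `ID2LABEL` and `decode_label` are illustrative names, not part of the dataset):

```python
# Subset of the label mapping from the dataset metadata above.
ID2LABEL = {
    0: "apple pie",
    2: "baklava",
    76: "pizza",
    95: "sushi",
    100: "waffles",
}

def decode_label(label_id: int) -> str:
    """Map an integer class id to its human-readable Food101 class name."""
    try:
        return ID2LABEL[label_id]
    except KeyError:
        raise ValueError(f"unknown label id: {label_id}") from None

print(decode_label(76))  # → pizza
```

When the dataset is loaded with the `datasets` library, the same mapping is available directly via the `ClassLabel` feature (`features["label"].int2str`), so a hand-written table is only needed for quick offline checks.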
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucyshi/language-dagger-bag | ---
license: cc-by-4.0
---
|
tinhpx2911/vietai_book_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8740094801
num_examples: 15189
download_size: 4515817258
dataset_size: 8740094801
---
# Dataset Card for "vietai_book_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Carlosgg14/kurapika | ---
license: openrail
---
|
sanchit-gandhi/common_voice_13_0_hi_pseudo_labelled | ---
dataset_info:
config_name: hi
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 133453462.934
num_examples: 4479
- name: validation
num_bytes: 67346656.935
num_examples: 2281
- name: test
num_bytes: 102696067.039
num_examples: 2947
download_size: 269383712
dataset_size: 303496186.908
configs:
- config_name: hi
data_files:
- split: train
path: hi/train-*
- split: validation
path: hi/validation-*
- split: test
path: hi/test-*
---
# Dataset Card for "common_voice_13_0_hi_pseudo_labelled"
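As a rough sanity check, computed purely from the split metadata above with no download needed (`SPLITS` and `avg_kib_per_example` are illustrative names), the average uncompressed size per example follows from `num_bytes` and `num_examples`:

```python
# Split statistics copied from the dataset metadata above.
SPLITS = {
    "train":      {"num_bytes": 133453462.934, "num_examples": 4479},
    "validation": {"num_bytes": 67346656.935,  "num_examples": 2281},
    "test":       {"num_bytes": 102696067.039, "num_examples": 2947},
}

def avg_kib_per_example(split: str) -> float:
    """Average uncompressed size of one example in KiB for the given split."""
    stats = SPLITS[split]
    return stats["num_bytes"] / stats["num_examples"] / 1024

for name in SPLITS:
    print(f"{name}: {avg_kib_per_example(name):.1f} KiB/example")
```

Roughly 29–34 KiB per example is consistent with short 16 kHz audio clips plus their transcript fields.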
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
astha/replicationpackage | ---
license: mit
---
|
open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter | ---
pretty_name: Evaluation run of mosaicml/mpt-7b-storywriter
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mosaicml/mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T08:53:05.263222](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter/blob/main/results_2023-10-16T08-53-05.263222.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.00025680027497237983,\n \"f1\": 0.0032026006711409396,\n\
\ \"f1_stderr\": 0.0005040610386397096,\n \"acc\": 0.2557221783741121,\n\
\ \"acc_stderr\": 0.0070244020999296625\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.00025680027497237983,\n\
\ \"f1\": 0.0032026006711409396,\n \"f1_stderr\": 0.0005040610386397096\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5114443567482242,\n\
\ \"acc_stderr\": 0.014048804199859325\n }\n}\n```"
repo_url: https://huggingface.co/mosaicml/mpt-7b-storywriter
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|arc:challenge|25_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_18_47.960530
path:
- '**/details_harness|drop|3_2023-09-22T15-18-47.960530.parquet'
- split: 2023_10_16T08_53_05.263222
path:
- '**/details_harness|drop|3_2023-10-16T08-53-05.263222.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T08-53-05.263222.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_18_47.960530
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-18-47.960530.parquet'
- split: 2023_10_16T08_53_05.263222
path:
- '**/details_harness|gsm8k|5_2023-10-16T08-53-05.263222.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T08-53-05.263222.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hellaswag|10_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:23:53.118062.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-20T10:23:53.118062.parquet'
- split: 2023_10_03T22_53_23.133729
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-53-23.133729.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T22-53-23.133729.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_18_47.960530
path:
- '**/details_harness|winogrande|5_2023-09-22T15-18-47.960530.parquet'
- split: 2023_10_16T08_53_05.263222
path:
- '**/details_harness|winogrande|5_2023-10-16T08-53-05.263222.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T08-53-05.263222.parquet'
- config_name: results
data_files:
- split: 2023_07_20T10_23_53.118062
path:
- results_2023-07-20T10:23:53.118062.parquet
- split: 2023_09_22T15_18_47.960530
path:
- results_2023-09-22T15-18-47.960530.parquet
- split: 2023_10_03T22_53_23.133729
path:
- results_2023-10-03T22-53-23.133729.parquet
- split: 2023_10_16T08_53_05.263222
path:
- results_2023-10-16T08-53-05.263222.parquet
- split: latest
path:
- results_2023-10-16T08-53-05.263222.parquet
---
# Dataset Card for Evaluation run of mosaicml/mpt-7b-storywriter
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mosaicml/mpt-7b-storywriter
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mosaicml/mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T08:53:05.263222](https://huggingface.co/datasets/open-llm-leaderboard/details_mosaicml__mpt-7b-storywriter/blob/main/results_2023-10-16T08-53-05.263222.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.00025680027497237983,
"f1": 0.0032026006711409396,
"f1_stderr": 0.0005040610386397096,
"acc": 0.2557221783741121,
"acc_stderr": 0.0070244020999296625
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.00025680027497237983,
"f1": 0.0032026006711409396,
"f1_stderr": 0.0005040610386397096
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859325
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
caiosoares26/vozdocoxinhz | ---
license: openrail
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_232 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1208067668.0
num_examples: 235399
download_size: 1239043662
dataset_size: 1208067668.0
---
# Dataset Card for "chunk_232"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/alan-walker | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/alan-walker"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.269381 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
        <div style="display:block; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/70b44d7b5a4be028e87b865dd425a4cc.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/alan-walker">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alan Walker</div>
<a href="https://genius.com/artists/alan-walker">
<div style="text-align: center; font-size: 14px;">@alan-walker</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/alan-walker).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/alan-walker")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   206 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/alan-walker")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
OliverYoung/codellama-threejs | ---
license: mit
---
|
Sleoruiz/discursos_balanceados_con_etiqueta | ---
dataset_info:
features:
- name: text
dtype: string
- name: name
dtype: string
- name: comision
dtype: string
- name: gaceta_numero
dtype: string
- name: fecha_gaceta
dtype: string
- name: labels
sequence: string
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 8237005
num_examples: 2242
download_size: 4342800
dataset_size: 8237005
---
# Dataset Card for "discursos_balanceados_con_etiqueta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roszcz/giant-midi-sustain | ---
dataset_info:
features:
- name: notes
struct:
- name: duration
sequence: float64
- name: end
sequence: float64
- name: pitch
sequence: int64
- name: start
sequence: float64
- name: velocity
sequence: int64
- name: midi_filename
dtype: string
splits:
- name: train
num_bytes: 1548922542
num_examples: 10853
download_size: 483630029
dataset_size: 1548922542
---
# Dataset Card for "giant-midi-sustain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mousse_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mousse/ムース/慕斯 (Arknights)
This is the dataset of mousse/ムース/慕斯 (Arknights), containing 148 images and their tags.
The core tags of this character are `animal_ears, cat_ears, multicolored_hair, green_eyes, short_hair, white_hair, hat, cat_girl, two-tone_hair, black_headwear, tail, cat_tail, blonde_hair, brown_hair, animal_ear_fluff, mini_hat, multiple_tails, two_tails, orange_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 148 | 209.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mousse_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 148 | 181.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mousse_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 357 | 358.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mousse_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
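For the `IMG+TXT` packages in the table above, each extracted image is assumed to be accompanied by a same-stem `.txt` tag file; a minimal helper to pair them after extraction might look like this (the pairing convention is an assumption based on the package type, not something the card specifies):

```python
import os

def pair_images_with_tags(filenames):
    """Pair each image file with its same-stem .txt tag file (assumed IMG+TXT layout)."""
    by_stem = {}
    for name in filenames:
        stem, ext = os.path.splitext(name)
        by_stem.setdefault(stem, set()).add(ext.lower())
    pairs = []
    for stem, exts in sorted(by_stem.items()):
        images = sorted(e for e in exts if e != ".txt")
        if ".txt" in exts and images:
            pairs.append((stem + images[0], stem + ".txt"))
    return pairs
```

After unzipping one of the `IMG+TXT` archives, `pair_images_with_tags(os.listdir(extract_dir))` yields `(image, caption)` pairs whose `.txt` files can then be read for the tags.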
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mousse_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, black_necktie, collared_shirt, long_sleeves, solo, white_shirt, black_jacket, open_mouth, upper_body, black_gloves, blush, fingerless_gloves, looking_at_viewer, skin_fang, holding_cat, simple_background, :d, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, black_jacket, black_necktie, looking_at_viewer, simple_background, solo, white_background, white_shirt, closed_mouth, collared_shirt, portrait, blush, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_necktie | collared_shirt | long_sleeves | solo | white_shirt | black_jacket | open_mouth | upper_body | black_gloves | blush | fingerless_gloves | looking_at_viewer | skin_fang | holding_cat | simple_background | :d | white_background | closed_mouth | portrait | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-----------------|:---------------|:-------|:--------------|:---------------|:-------------|:-------------|:---------------|:--------|:--------------------|:--------------------|:------------|:--------------|:--------------------|:-----|:-------------------|:---------------|:-----------|:--------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | X | X | | X | | X | | X | | | X | | X | X | X | X |
|
jlbaker361/flickr_humans_dim_128_0.5k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 14606865.0
num_examples: 500
download_size: 14589079
dataset_size: 14606865.0
---
# Dataset Card for "flickr_humans_dim_128_0.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mertllc/test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 10050421.0
num_examples: 500
download_size: 9992979
dataset_size: 10050421.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AthenaAgent/MockingBirdv1-SFT | ---
license: mit
---
|
DialogueCharacter/english_dialogue_instruction_with_reward_score_judged_by_13B_llama2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: reward_score
dtype: float64
splits:
- name: train
num_bytes: 888623949
num_examples: 909740
download_size: 475765484
dataset_size: 888623949
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dialogue_instruction_with_reward_score_judged_by_13B_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_zero_plural_after_quantifier | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 243206
num_examples: 987
- name: dev_mismatched
num_bytes: 230693
num_examples: 931
- name: test_matched
num_bytes: 221767
num_examples: 927
- name: test_mismatched
num_bytes: 218929
num_examples: 903
- name: train
num_bytes: 9040951
num_examples: 36832
download_size: 6244995
dataset_size: 9955546
---
# Dataset Card for "MULTI_VALUE_mnli_zero_plural_after_quantifier"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Sao10K__14B-Glacier-Stack | ---
pretty_name: Evaluation run of Sao10K/14B-Glacier-Stack
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/14B-Glacier-Stack](https://huggingface.co/Sao10K/14B-Glacier-Stack) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__14B-Glacier-Stack\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T12:10:05.722795](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__14B-Glacier-Stack/blob/main/results_2024-03-07T12-10-05.722795.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6686145157496808,\n\
\ \"acc_stderr\": 0.03159509998777849,\n \"acc_norm\": 0.671765635485992,\n\
\ \"acc_norm_stderr\": 0.032228278305947135,\n \"mc1\": 0.5030599755201959,\n\
\ \"mc1_stderr\": 0.017503173260960625,\n \"mc2\": 0.6537435659083609,\n\
\ \"mc2_stderr\": 0.015546800478831346\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.01359243151906808,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7013543118900617,\n\
\ \"acc_stderr\": 0.0045672877757005625,\n \"acc_norm\": 0.8834893447520414,\n\
\ \"acc_norm_stderr\": 0.003201805872737069\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.0365634365335316,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.0365634365335316\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.031639106653672915,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.031639106653672915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370334,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370334\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604548,\n \"\
acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604548\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343336,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343336\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377274,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377274\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.726890756302521,\n \"acc_stderr\": 0.028942004040998167,\n \
\ \"acc_norm\": 0.726890756302521,\n \"acc_norm_stderr\": 0.028942004040998167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8697247706422019,\n \"acc_stderr\": 0.014431862852473262,\n \"\
acc_norm\": 0.8697247706422019,\n \"acc_norm_stderr\": 0.014431862852473262\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.02991858670779883,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.02991858670779883\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579834,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5251396648044693,\n\
\ \"acc_stderr\": 0.01670135084268263,\n \"acc_norm\": 0.5251396648044693,\n\
\ \"acc_norm_stderr\": 0.01670135084268263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.0242886194660461,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.0242886194660461\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543325,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543325\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5078226857887875,\n\
\ \"acc_stderr\": 0.012768673076111898,\n \"acc_norm\": 0.5078226857887875,\n\
\ \"acc_norm_stderr\": 0.012768673076111898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887657,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887657\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5030599755201959,\n\
\ \"mc1_stderr\": 0.017503173260960625,\n \"mc2\": 0.6537435659083609,\n\
\ \"mc2_stderr\": 0.015546800478831346\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5261561789234268,\n \
\ \"acc_stderr\": 0.013753627037255047\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/14B-Glacier-Stack
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-10-05.722795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T12-10-05.722795.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- '**/details_harness|winogrande|5_2024-03-07T12-10-05.722795.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T12-10-05.722795.parquet'
- config_name: results
data_files:
- split: 2024_03_07T12_10_05.722795
path:
- results_2024-03-07T12-10-05.722795.parquet
- split: latest
path:
- results_2024-03-07T12-10-05.722795.parquet
---
# Dataset Card for Evaluation run of Sao10K/14B-Glacier-Stack
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/14B-Glacier-Stack](https://huggingface.co/Sao10K/14B-Glacier-Stack) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__14B-Glacier-Stack",
"harness_winogrande_5",
split="train")
```
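The per-task metrics in the "Latest results" JSON below can also be re-aggregated yourself. Here is a minimal, self-contained sketch: the task names and `acc` values are copied from this card, but the plain mean shown is an illustrative aggregation, not necessarily the leaderboard's exact formula.

```python
# Average the per-task accuracies of the MMLU ("hendrycksTest") entries.
# The dict keys mirror the result names in the JSON on this card.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7631578947368421},
}

# Keep only MMLU subtasks (prefix "harness|hendrycksTest-").
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]

# Unweighted mean over the selected subtasks.
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```

The same pattern works on the full `results` configuration once loaded with `load_dataset` as shown above.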
## Latest results
These are the [latest results from run 2024-03-07T12:10:05.722795](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__14B-Glacier-Stack/blob/main/results_2024-03-07T12-10-05.722795.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in its own configuration, with a "latest" split for each eval):
```json
{
"all": {
"acc": 0.6686145157496808,
"acc_stderr": 0.03159509998777849,
"acc_norm": 0.671765635485992,
"acc_norm_stderr": 0.032228278305947135,
"mc1": 0.5030599755201959,
"mc1_stderr": 0.017503173260960625,
"mc2": 0.6537435659083609,
"mc2_stderr": 0.015546800478831346
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.01359243151906808,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7013543118900617,
"acc_stderr": 0.0045672877757005625,
"acc_norm": 0.8834893447520414,
"acc_norm_stderr": 0.003201805872737069
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.0365634365335316,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.0365634365335316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370334,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370334
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604548,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604548
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343336,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377274,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377274
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.726890756302521,
"acc_stderr": 0.028942004040998167,
"acc_norm": 0.726890756302521,
"acc_norm_stderr": 0.028942004040998167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8697247706422019,
"acc_stderr": 0.014431862852473262,
"acc_norm": 0.8697247706422019,
"acc_norm_stderr": 0.014431862852473262
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.02991858670779883,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.02991858670779883
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579834,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5251396648044693,
"acc_stderr": 0.01670135084268263,
"acc_norm": 0.5251396648044693,
"acc_norm_stderr": 0.01670135084268263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.0242886194660461,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.0242886194660461
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023132376234543325,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023132376234543325
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5078226857887875,
"acc_stderr": 0.012768673076111898,
"acc_norm": 0.5078226857887875,
"acc_norm_stderr": 0.012768673076111898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887657,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887657
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5030599755201959,
"mc1_stderr": 0.017503173260960625,
"mc2": 0.6537435659083609,
"mc2_stderr": 0.015546800478831346
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.5261561789234268,
"acc_stderr": 0.013753627037255047
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ThiagoSantosLK/Samuel1 | ---
license: openrail
---
|
gagan3012/MMarcoRerankingEn2Ar | ---
dataset_info:
features:
- name: query
dtype: string
- name: negative
sequence: string
- name: positive
sequence: string
splits:
- name: test
num_bytes: 526997
num_examples: 100
download_size: 213946
dataset_size: 526997
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
CyberHarem/shigure_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shigure/間宵シグレ/时雨 (Blue Archive)
This is the dataset of shigure/間宵シグレ/时雨 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `animal_ears, halo, weasel_ears, purple_eyes, green_hair, breasts, short_hair, hair_between_eyes, blue_halo, tail, weasel_tail, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 959.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 782.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1321 | 1.62 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shigure_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, bath_yukata, cleavage, collarbone, looking_at_viewer, official_alternate_costume, simple_background, solo, white_background, blush, smile, upper_body, bare_shoulders, closed_mouth, off_shoulder, open_mouth |
| 1 | 11 |  |  |  |  |  | 1girl, bath_yukata, blush, cleavage, holding_cup, looking_at_viewer, official_alternate_costume, simple_background, smile, solo, white_background, collarbone, open_mouth, upper_body, yagasuri, drunk, bare_shoulders, grey_kimono, obi, off_shoulder |
| 2 | 10 |  |  |  |  |  | 1girl, bath_yukata, looking_at_viewer, official_alternate_costume, smile, solo, blush, cleavage, collarbone, open_mouth, simple_background, white_background, grey_kimono, yagasuri, holding, sitting, sash |
| 3 | 5 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, official_alternate_costume, smile, solo, wide_sleeves, bath_yukata, cleavage, grey_kimono, holding_bottle, obi, open_mouth, tokkuri, bare_shoulders, long_sleeves, off_shoulder, snow, teeth |
| 4 | 13 |  |  |  |  |  | 1girl, bath_yukata, holding_fan, looking_at_viewer, official_alternate_costume, solo, uchiwa, blush, smile, obi, white_background, open_mouth, simple_background, wide_sleeves, long_sleeves, yagasuri, collarbone, upper_body |
| 5 | 8 |  |  |  |  |  | 1girl, blush, cleavage, collarbone, holding_cup, looking_at_viewer, naked_towel, onsen, open_mouth, partially_submerged, smile, solo, water, bare_shoulders, tokkuri, white_towel, blue_eyes, outdoors, sitting, snow, upper_body, wet |
| 6 | 8 |  |  |  |  |  | 1girl, blush, nipples, 1boy, hetero, penis, solo_focus, bar_censor, naked_kimono, official_alternate_costume, open_mouth, smile, spread_legs, weasel_girl, yukata, after_sex, after_vaginal, cum_in_pussy, cumdrip, sweat, testicles, erection, girl_on_top, mosaic_censoring, small_breasts |
| 7 | 5 |  |  |  |  |  | 1girl, blush, fur_trim, grey_gloves, grey_headwear, hairclip, hat, holding, jacket, simple_background, solo, upper_body, white_background, looking_at_viewer, open_mouth, smile, long_sleeves, pink_eyes, capelet, flask, multicolored_eyes |
| 8 | 5 |  |  |  |  |  | 1girl, grey_gloves, grey_headwear, grey_jacket, hairclip, hat, looking_at_viewer, solo, upper_body, blush, closed_mouth, fur_trim, simple_background, white_background, smile, flask, grey_coat, holding, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bath_yukata | cleavage | collarbone | looking_at_viewer | official_alternate_costume | simple_background | solo | white_background | blush | smile | upper_body | bare_shoulders | closed_mouth | off_shoulder | open_mouth | holding_cup | yagasuri | drunk | grey_kimono | obi | holding | sitting | sash | wide_sleeves | holding_bottle | tokkuri | long_sleeves | snow | teeth | holding_fan | uchiwa | naked_towel | onsen | partially_submerged | water | white_towel | blue_eyes | outdoors | wet | nipples | 1boy | hetero | penis | solo_focus | bar_censor | naked_kimono | spread_legs | weasel_girl | yukata | after_sex | after_vaginal | cum_in_pussy | cumdrip | sweat | testicles | erection | girl_on_top | mosaic_censoring | small_breasts | fur_trim | grey_gloves | grey_headwear | hairclip | hat | jacket | pink_eyes | capelet | flask | multicolored_eyes | grey_jacket | grey_coat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-----------|:-------------|:--------------------|:-----------------------------|:--------------------|:-------|:-------------------|:--------|:--------|:-------------|:-----------------|:---------------|:---------------|:-------------|:--------------|:-----------|:--------|:--------------|:------|:----------|:----------|:-------|:---------------|:-----------------|:----------|:---------------|:-------|:--------|:--------------|:---------|:--------------|:--------|:----------------------|:--------|:--------------|:------------|:-----------|:------|:----------|:-------|:---------|:--------|:-------------|:-------------|:---------------|:--------------|:--------------|:---------|:------------|:----------------|:---------------|:----------|:--------|:------------|:-----------|:--------------|:-------------------|:----------------|:-----------|:--------------|:----------------|:-----------|:------|:---------|:------------|:----------|:--------|:--------------------|:--------------|:------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | X | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | | X | X | | X | | X | X | | | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | | | | X | | X | | | X | | | | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | X | X | | | X | | X | X | X | X | | | X | X | | | | | | X | | | | X | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | | X | X | X | X | X | X | | | | X | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | |
| 8 | 5 |  |  |  |  |  | X | | | | X | | X | X | X | X | X | X | | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | | X | X |
|
sanchit-gandhi/librispeech_asr_dummy | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- audio-classification
task_ids:
- speaker-identification
paperswithcode_id: librispeech-1
pretty_name: LibriSpeech Dummy
configs:
- config_name: default
data_files:
- split: test.other
path: data/test.other-*
- split: train.other.500
path: data/train.other.500-*
- split: train.clean.360
path: data/train.clean.360-*
- split: validation.clean
path: data/validation.clean-*
- split: test.clean
path: data/test.clean-*
- split: validation.other
path: data/validation.other-*
- split: train.clean.100
path: data/train.clean.100-*
- config_name: short-form
data_files:
- split: validation
path: short-form/validation-*
dataset_info:
config_name: short-form
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: validation
num_bytes: 9677021.0
num_examples: 73
download_size: 9192059
dataset_size: 9677021.0
---
# Dataset Card for librispeech_asr_dummy
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [LibriSpeech ASR corpus](http://www.openslr.org/12)
- **Repository:** [Needs More Information]
- **Paper:** [LibriSpeech: An ASR Corpus Based On Public Domain Audio Books](https://www.danielpovey.com/files/2015_icassp_librispeech.pdf)
- **Leaderboard:** [The 🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
- **Point of Contact:** [Daniel Povey](mailto:dpovey@gmail.com)
### Dataset Summary
This is a **truncated** version of the LibriSpeech dataset. It contains 20 samples from each of the splits. To view the full dataset, visit: https://huggingface.co/datasets/librispeech_asr
LibriSpeech is a corpus of approximately 1000 hours of 16kHz read English speech, prepared by Vassil Panayotov with the assistance of Daniel Povey. The data is derived from read audiobooks from the LibriVox project, and has been carefully segmented and aligned.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `audio-speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active Hugging Face leaderboard which can be found at https://huggingface.co/spaces/huggingface/hf-speech-bench. The leaderboard ranks models uploaded to the Hub based on their WER. An external leaderboard at https://paperswithcode.com/sota/speech-recognition-on-librispeech-test-clean ranks the latest models from research and academia.
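As an aside (not part of the original card), the WER metric mentioned above is simply the word-level edit distance between a hypothesis transcript and the reference, divided by the reference length. Production evaluations typically use a library such as `jiwer` or 🤗 Evaluate, but a minimal pure-Python sketch looks like this:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # i deletions turn ref[:i] into an empty hypothesis
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # j insertions turn an empty reference into hyp[:j]
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(substitution, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# One deleted word out of six reference words -> WER of roughly 0.167
print(word_error_rate("a man said to the universe", "a man said to universe"))
```

Note that real ASR scoring pipelines also normalize text (casing, punctuation) before computing this distance; that step is omitted here.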
### Languages
The audio is in English. There are two configurations: `clean` and `other`.
The speakers in the corpus were ranked according to the WER of the transcripts of a model trained on
a different dataset, and were divided roughly in the middle,
with the lower-WER speakers designated as "clean" and the higher WER speakers designated as "other".
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, usually called `file` and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided.
```
{'chapter_id': 141231,
'file': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'audio': {'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346,
0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000},
'id': '1272-141231-0000',
'speaker_id': 1272,
'text': 'A MAN SAID TO THE UNIVERSE SIR I EXIST'}
```
### Data Fields
- file: A path to the downloaded audio file in .flac format.
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- id: unique id of the data sample.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
- chapter_id: id of the audiobook chapter which includes the transcription.
### Data Splits
The size of the corpus makes it impractical, or at least inconvenient
for some users, to distribute it as a single large archive. Thus the
training portion of the corpus is split into three subsets, with approximate size 100, 360 and 500 hours respectively.
A simple automatic
procedure was used to select the audio in the first two sets to be, on
average, of higher recording quality and with accents closer to US
English. An acoustic model was trained on WSJ’s si-84 data subset
and was used to recognize the audio in the corpus, using a bigram
LM estimated on the text of the respective books. We computed the
Word Error Rate (WER) of this automatic transcript relative to our
reference transcripts obtained from the book texts.
The speakers in the corpus were ranked according to the WER of
the WSJ model’s transcripts, and were divided roughly in the middle,
with the lower-WER speakers designated as "clean" and the higher-WER speakers designated as "other".
For "clean", the data is split into train, validation, and test set. The train set is further split into train.100 and train.360
respectively accounting for 100h and 360h of the training data.
For "other", the data is split into train, validation, and test set. The train set contains approximately 500h of recorded speech.
| | Train.500 | Train.360 | Train.100 | Valid | Test |
| ----- | ------ | ----- | ---- | ---- | ---- |
| clean | - | 104014 | 28539 | 2703 | 2620|
| other | 148688 | - | - | 2864 | 2939 |
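The clean/other partition described above — rank speakers by the WSJ model's WER and split roughly at the middle — can be sketched as follows. The speaker IDs and WER values here are invented for illustration only; they are not the real LibriSpeech measurements:

```python
# Hypothetical per-speaker WERs (illustrative values, not real data)
speaker_wers = {
    "1272": 0.05, "1462": 0.21, "1673": 0.09,
    "174":  0.32, "1919": 0.12, "1988": 0.27,
}

ranked = sorted(speaker_wers, key=speaker_wers.get)  # lowest WER first
midpoint = len(ranked) // 2
clean_speakers = set(ranked[:midpoint])   # lower-WER half -> "clean"
other_speakers = set(ranked[midpoint:])   # higher-WER half -> "other"

print(sorted(clean_speakers))
```

This is only a conceptual sketch; the actual corpus construction also balanced recording quality and accent, as described in the LibriSpeech paper.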
## Dataset Creation
### Personal and Sensitive Information
The dataset consists of recordings of people who have donated their voices online. You agree to not attempt to determine the identity of speakers in this dataset.
## Additional Information
### Dataset Curators
The dataset was initially created by Vassil Panayotov, Guoguo Chen, Daniel Povey, and Sanjeev Khudanpur.
### Licensing Information
[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@inproceedings{panayotov2015librispeech,
title={Librispeech: an ASR corpus based on public domain audio books},
author={Panayotov, Vassil and Chen, Guoguo and Povey, Daniel and Khudanpur, Sanjeev},
booktitle={Acoustics, Speech and Signal Processing (ICASSP), 2015 IEEE International Conference on},
pages={5206--5210},
year={2015},
organization={IEEE}
}
```
|
iashchak/igor_link_dialogues_rendered | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
struct:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 29950800
num_examples: 31516
download_size: 15605747
dataset_size: 29950800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "igor_link_dialogues_rendered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
W1lson/SystemRequirements | ---
license: openrail
---
|
liuyanchen1015/VALUE_wikitext2_negative_inversion | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 647
num_examples: 1
- name: train
num_bytes: 1026
num_examples: 1
- name: validation
num_bytes: 631
num_examples: 1
download_size: 18138
dataset_size: 2304
---
# Dataset Card for "VALUE_wikitext2_negative_inversion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SYSPIN/LIMMITS24_target_speaker_fewshot_samples | ---
license: cc-by-4.0
---
Few-shot and reference files for the LIMMITS 24 Challenge - https://sites.google.com/view/limmits24/
IndicTTS speakers are taken from https://www.iitm.ac.in/donlab/tts/detailed_statistics.php |
jondurbin/airoboros-gpt4-m2.0 | ---
license: other
---
## Overview
This is a merge of https://hf.co/datasets/jondurbin/airoboros-gpt4-1.4.1 and https://hf.co/datasets/jondurbin/airoboros-gpt4-2.0
### Category breakdown

### Licence and usage restrictions
The data was generated by gpt-4 via OpenAI API calls.
The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI
- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise impermissibly licensed material in the first place
- other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2
I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.
Your best bet is probably to avoid using this commercially due to the OpenAI API usage.
Either way, by using this model, you agree to completely indemnify me from any and all license related issues.
Attribution would be nice if you use some or all of the data. |
DBQ/Mr.Porter.Product.prices.United.Arab.Emirates | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United Arab Emirates - Mr Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Mr Porter
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 8881377
num_examples: 27033
download_size: 2108304
dataset_size: 8881377
---
# Mr Porter web scraped data
## About the website
The **E-commerce industry** within the **EMEA** region, particularly in the **United Arab Emirates (UAE)**, has witnessed substantial growth and continues to expand. Internet penetration and the fast-paced life in the UAE have boosted the growth of the e-commerce industry. This dataset provides **e-commerce product-list page (PLP) data** scraped from **Mr Porter in the United Arab Emirates**. This information is essential for understanding purchasing patterns and customer preferences, vital elements in developing effective e-commerce strategies, and it helps track the performance of individual products, enabling Mr Porter to stay ahead in a competitive industry.
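Given the price fields listed in this card's metadata (`full_price`, `price`, `flg_discount`), a minimal sketch of deriving a per-product discount percentage might look like the following; the rows are invented placeholders, not real listings:

```python
# Invented placeholder rows using the field names from the dataset metadata.
rows = [
    {"title": "Example sneaker", "full_price": 200.0, "price": 150.0, "flg_discount": 1},
    {"title": "Example shirt", "full_price": 80.0, "price": 80.0, "flg_discount": 0},
]

for r in rows:
    # Discount expressed as a percentage of the full price (0.0 when undiscounted).
    r["discount_pct"] = (r["full_price"] - r["price"]) / r["full_price"] * 100

print([r["discount_pct"] for r in rows])  # → [25.0, 0.0]
```

The derived percentage should agree with the dataset's own `flg_discount` indicator on each row.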
## Link to **dataset**
[United Arab Emirates - Mr Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Mr%20Porter%20Product-prices%20United%20Arab%20Emirates/r/recHILSDUDBd0BMJ4)
|
Dmenorsz/MCDALESTE | ---
license: openrail
---
|
FINNUMBER/FINCH_TRAIN_FPB | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3629471
num_examples: 2883
download_size: 1440661
dataset_size: 3629471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kardosdrur/folketing-wiki-clean | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1046056.8
num_examples: 3600
- name: test
num_bytes: 261514.2
num_examples: 900
download_size: 760114
dataset_size: 1307571.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mmathys/profanity | ---
license: mit
---
# The Obscenity List
*by [Surge AI, the world's most powerful NLP data labeling platform and workforce](https://www.surgehq.ai)*
Ever wish you had a ready-made list of profanity? Maybe you want to remove NSFW comments, filter offensive usernames, or build content moderation tools, and you can't dream up enough obscenities on your own.
At Surge AI, we help companies build human-powered datasets to train stunning AI and NLP, and we're creating the world's largest profanity list in 20+ languages.
## Dataset
This repo contains 1600+ popular English profanities and their variations.
**Columns**
* `text`: the profanity
* `canonical_form_1`: the profanity's canonical form
* `canonical_form_2`: an additional canonical form, if applicable
* `canonical_form_3`: an additional canonical form, if applicable
* `category_1`: the profanity's primary category (see below for list of categories)
* `category_2`: the profanity's secondary category, if applicable
* `category_3`: the profanity's tertiary category, if applicable
* `severity_rating`: We asked 5 [Surge AI](https://www.surgehq.ai) data labelers to rate how severe they believed each profanity to be, on a 1-3 point scale. This is the mean of those 5 ratings.
* `severity_description`: We rounded `severity_rating` to the nearest integer. `Mild` corresponds to a rounded mean rating of `1`, `Strong` to `2`, and `Severe` to `3`.
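As a sketch of how these columns might be used for filtering (the rows below are mild invented placeholders, not actual entries from the list), you could keep only entries at a chosen severity level:

```python
# Invented placeholder rows mirroring the dataset's columns.
rows = [
    {"text": "darn", "canonical_form_1": "darn", "severity_rating": 1.0, "severity_description": "Mild"},
    {"text": "frickin", "canonical_form_1": "frick", "severity_rating": 1.8, "severity_description": "Strong"},
    {"text": "badword", "canonical_form_1": "badword", "severity_rating": 2.9, "severity_description": "Severe"},
]

# Keep only the terms whose rounded mean rating is "Severe" (a rounded score of 3).
severe_terms = [r["text"] for r in rows if r["severity_description"] == "Severe"]
print(severe_terms)  # → ['badword']
```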
## Categories
We organized the profanity into the following categories:
- sexual anatomy / sexual acts (ass kisser, dick, pigfucker)
- bodily fluids / excrement (shit, cum)
- sexual orientation / gender (faggot, tranny, bitch, whore)
- racial / ethnic (chink, n3gro)
- mental disability (retard, dumbass)
- physical disability (quadriplegic bitch)
- physical attributes (fatass, ugly whore)
- animal references (pigfucker, jackass)
- religious offense (goddamn)
- political (China virus)
## Future
We'll be adding more languages and profanity annotations (e.g., augmenting each profanity with its severity level, type, and other variations) over time.
Check out our other [free datasets](https://www.surgehq.ai/datasets).
Sign up [here](https://forms.gle/u1SKL4zySK2wMp1r7) to receive updates on this dataset and be the first to learn about new datasets we release!
## Contact
Need a larger set of expletives and slurs, or a list of swear words in other languages (Spanish, French, German, Japanese, Portuguese, etc)? We work with top AI and content moderation companies around the world, and we love feedback. Post an issue or reach out to team@surgehq.ai!

Follow us on Twitter at [@HelloSurgeAI](https://www.twitter.com/@HelloSurgeAI).
## Original Repo
You can find the original repository here: https://github.com/surge-ai/profanity/ |
mnist | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-nist
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
paperswithcode_id: mnist
pretty_name: MNIST
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
config_name: mnist
splits:
- name: train
num_bytes: 17470848
num_examples: 60000
- name: test
num_bytes: 2916440
num_examples: 10000
download_size: 11594722
dataset_size: 20387288
---
# Dataset Card for MNIST
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://yann.lecun.com/exdb/mnist/
- **Repository:**
- **Paper:** MNIST handwritten digit database by Yann LeCun, Corinna Cortes, and CJ Burges
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The MNIST dataset consists of 70,000 28x28 black-and-white images of handwritten digits extracted from two NIST databases. There are 60,000 images in the training dataset and 10,000 images in the test dataset, one class per digit, for a total of 10 classes, with 7,000 images (6,000 train images and 1,000 test images) per class.
Half of the images were drawn by Census Bureau employees and the other half by high school students (this split is evenly distributed in the training and testing sets).
### Supported Tasks and Leaderboards
- `image-classification`: The goal of this task is to classify a given image of a handwritten digit into one of 10 classes representing integer values from 0 to 9, inclusively. The leaderboard is available [here](https://paperswithcode.com/sota/image-classification-on-mnist).
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its label:
```
{
'image': <PIL.PngImagePlugin.PngImageFile image mode=L size=28x28 at 0x276021F6DD8>,
'label': 5
}
```
### Data Fields
- `image`: A `PIL.Image.Image` object containing the 28x28 image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files can take a significant amount of time, so it is important to query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `label`: an integer between 0 and 9 representing the digit.
### Data Splits
The data is split into training and test set. All the images in the test set were drawn by different individuals than the images in the training set. The training set contains 60,000 images and the test set 10,000 images.
## Dataset Creation
### Curation Rationale
The MNIST database was created to provide a testbed for people wanting to try pattern recognition methods or machine learning algorithms while spending minimal effort on preprocessing and formatting. Images of the original dataset (NIST) were in two groups, one consisting of images drawn by Census Bureau employees and one consisting of images drawn by high school students. In NIST, the training set was built by grouping all the images of the Census Bureau employees, and the test set was built by grouping the images from the high school students.
The goal in building MNIST was to have a training and test set following the same distributions, so the training set contains 30,000 images drawn by Census Bureau employees and 30,000 images drawn by high school students, and the test set contains 5,000 images of each group. The curators took care to make sure all the images in the test set were drawn by different individuals than the images in the training set.
### Source Data
#### Initial Data Collection and Normalization
The original images from NIST were size normalized to fit a 20x20 pixel box while preserving their aspect ratio. The resulting images contain grey levels (i.e., pixels don't simply have a value of black and white, but a level of greyness from 0 to 255) as a result of the anti-aliasing technique used by the normalization algorithm. The images were then centered in a 28x28 image by computing the center of mass of the pixels, and translating the image so as to position this point at the center of the 28x28 field.
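As a rough, simplified sketch of the centering step described above (not the original preprocessing code), one can compute the pixel center of mass and translate the image so that point lands at the center of the grid:

```python
def center_of_mass(img):
    """Intensity-weighted (row, col) center of mass of a 2D pixel grid."""
    total = sum(v for row in img for v in row)
    r = sum(i * v for i, row in enumerate(img) for v in row) / total
    c = sum(j * v for row in img for j, v in enumerate(row)) / total
    return r, c

def shift(img, dr, dc):
    """Translate the image by integer offsets, padding with zeros."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            ni, nj = i + dr, j + dc
            if 0 <= ni < h and 0 <= nj < w:
                out[ni][nj] = img[i][j]
    return out

# Toy 5x5 "image" with one bright pixel in the corner.
img = [[0] * 5 for _ in range(5)]
img[0][0] = 255
r, c = center_of_mass(img)                         # (0.0, 0.0)
centered = shift(img, round(2 - r), round(2 - c))  # move mass to grid center (2, 2)
print(centered[2][2])  # → 255
```

The real pipeline operates on anti-aliased grey levels in a 28x28 field, but the translation-to-center idea is the same.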
#### Who are the source language producers?
Half of the source images were drawn by Census Bureau employees, half by high school students. According to the dataset curator, the images from the first group are more easily recognizable.
### Annotations
#### Annotation process
The images were not annotated after their creation: the image creators annotated their images with the corresponding label after drawing them.
#### Who are the annotators?
Same as the source data creators.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Chris Burges, Corinna Cortes and Yann LeCun
### Licensing Information
MIT Licence
### Citation Information
```
@article{lecun2010mnist,
title={MNIST handwritten digit database},
author={LeCun, Yann and Cortes, Corinna and Burges, CJ},
journal={ATT Labs [Online]. Available: http://yann.lecun.com/exdb/mnist},
volume={2},
year={2010}
}
```
### Contributions
Thanks to [@sgugger](https://github.com/sgugger) for adding this dataset. |
thobauma/harmless-poisoned-0.05-BeHarmfulNow-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/trec-robust04_fold3 | ---
pretty_name: '`trec-robust04/fold3`'
viewer: false
source_datasets: ['irds/trec-robust04']
task_categories:
- text-retrieval
---
# Dataset Card for `trec-robust04/fold3`
The `trec-robust04/fold3` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/trec-robust04#trec-robust04/fold3).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- `qrels`: (relevance assessments); count=62,901
- For `docs`, use [`irds/trec-robust04`](https://huggingface.co/datasets/irds/trec-robust04)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/trec-robust04_fold3', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/trec-robust04_fold3', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Voorhees2004Robust,
title={Overview of the TREC 2004 Robust Retrieval Track},
author={Ellen Voorhees},
booktitle={TREC},
year={2004}
}
@inproceedings{Huston2014ACO,
title={A Comparison of Retrieval Models using Term Dependencies},
author={Samuel Huston and W. Bruce Croft},
booktitle={CIKM},
year={2014}
}
```
|
graelo/cancre | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
pretty_name: Cancre (French Grammatical Errors)
paperswithcode_id: null
license:
- cc-by-sa-3.0
task_categories:
- text-generation
- text-classification
task_ids:
- language-modeling
source_datasets:
- original
multilinguality:
- monolingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
language:
- fr
dataset_info:
features:
- name: phrase1
dtype: string
- name: phrase2
dtype: string
- name: explication
dtype: string
splits:
- name: train
num_bytes: 1934861
num_examples: 10000
- name: test
num_bytes: 327827
num_examples: 1681
download_size: 2866947
dataset_size: 2262688
---
# French Grammatical Errors
This dataset contains pairs of sentences and an explanation:
- "phrase1" is a French sentence containing a grammatical error
- "phrase2" is the same sentence without any error (please reach out if you think
an error is present -- I could not see any)
- "explication" is some text explaining the grammatical error
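A minimal sketch of turning these columns into (erroneous, corrected) pairs for a grammatical-error-correction model; the row below is an invented placeholder, not a real entry:

```python
# Invented placeholder row using the card's column names.
rows = [
    {
        "phrase1": "Il mange les pomme.",
        "phrase2": "Il mange les pommes.",
        "explication": "accord du pluriel",
    },
]

# Build (input, target) pairs; the explanation could serve as auxiliary supervision.
pairs = [(r["phrase1"], r["phrase2"]) for r in rows]
print(pairs[0])  # → ('Il mange les pomme.', 'Il mange les pommes.')
```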
## Release Notes
`0.1.0`
- No error category is present; you would have to infer it from the `explication` column
|
Megnis/python_code_instructions_18k_LlaMa2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11576037
num_examples: 18612
download_size: 5536608
dataset_size: 11576037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wisesight1000 | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- th
license:
- cc0-1.0
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- extended|wisesight_sentiment
task_categories:
- token-classification
task_ids: []
pretty_name: wisesight1000
tags:
- word-tokenization
dataset_info:
features:
- name: char
sequence: string
- name: char_type
sequence:
class_label:
names:
'0': b_e
'1': c
'2': d
'3': n
'4': o
'5': p
'6': q
'7': s
'8': s_e
'9': t
'10': v
'11': w
- name: is_beginning
sequence:
class_label:
names:
'0': neg
'1': pos
config_name: wisesight1000
splits:
- name: train
num_bytes: 1735438
num_examples: 993
download_size: 222691
dataset_size: 1735438
---
# Dataset Card for `wisesight1000`
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/PyThaiNLP/wisesight-sentiment
- **Repository:** https://github.com/PyThaiNLP/wisesight-sentiment/blob/master/word-tokenization/
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** https://github.com/PyThaiNLP/
### Dataset Summary
`wisesight1000` contains Thai social media texts randomly drawn from the full `wisesight-sentiment` dataset, tokenized by human annotators.
The samples are drawn evenly from the labels `neg` (negative), `neu` (neutral), `pos` (positive), and `q` (question), with 250 samples each; some texts that looked like spam were removed. Because these samples are representative of real-world content, we believe having these annotated samples will allow the community to robustly evaluate tokenization algorithms.
### Supported Tasks and Leaderboards
word tokenization
### Languages
Thai
## Dataset Structure
### Data Instances
```
{'char': ['E', 'u', 'c', 'e', 'r', 'i', 'n', ' ', 'p', 'r', 'o', ' ', 'a', 'c', 'n', 'e', ' ', 'ค', '่', 'ะ', ' ', 'ใ', 'ช', '้', 'แ', 'ล', '้', 'ว', 'ส', 'ิ', 'ว', 'ข', 'ึ', '้', 'น', 'เ', 'พ', 'ิ', '่', 'ม', 'ท', 'ุ', 'ก', 'ว', 'ั', 'น', ' ', 'ม', 'า', 'ด', 'ู', 'ก', 'ั', 'น', 'น', 'ะ', 'ค', 'ะ', ' ', 'ว', '่', 'า', 'จ', 'ั', 'ด', 'ก', 'า', 'ร', 'ป', 'ั', 'ญ', 'ห', 'า', 'ส', 'ิ', 'ว', 'ใ', 'น', '7', 'ว', 'ั', 'น', 'ไ', 'ด', '้', 'ร', 'ึ', 'ม', 'ั', '่', 'ย', 'ย', 'ย', 'ย', 'ย', 'ย', 'ย', 'ย', ' ', 'ล', '่', 'า', 'ส', 'ุ', 'ด', 'ไ', 'ป', 'ล', '้', 'า', 'ง', 'ห', 'น', '้', '…', '\n'], 'char_type': [0, 8, 8, 8, 8, 8, 8, 5, 8, 8, 8, 5, 8, 8, 8, 8, 5, 1, 9, 10, 5, 11, 1, 9, 11, 1, 9, 1, 1, 10, 1, 1, 10, 9, 1, 11, 1, 10, 9, 1, 1, 10, 1, 1, 4, 1, 5, 1, 10, 1, 10, 1, 4, 1, 1, 10, 1, 10, 5, 1, 9, 10, 1, 4, 1, 1, 10, 1, 1, 4, 1, 3, 10, 1, 10, 1, 11, 1, 2, 1, 4, 1, 11, 1, 9, 1, 10, 1, 4, 9, 1, 1, 1, 1, 1, 1, 1, 1, 5, 1, 9, 10, 1, 10, 1, 11, 1, 1, 9, 10, 1, 3, 1, 9, 4, 4], 'is_beginning': [1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0]}
{'char': ['แ', 'พ', 'ง', 'เ', 'ว', '่', 'อ', 'ร', '์', ' ', 'เ', 'บ', 'ี', 'ย', 'ร', '์', 'ช', '้', 'า', 'ง', 'ต', '้', 'น', 'ท', 'ุ', 'น', 'ข', 'ว', 'ด', 'ล', 'ะ', 'ไ', 'ม', '่', 'ถ', 'ึ', 'ง', ' ', '5', '0', ' ', 'ข', 'า', 'ย', ' ', '1', '2', '0', ' ', '😰', '😰', '😰', '์', '\n'], 'char_type': [11, 1, 1, 11, 1, 9, 1, 1, 7, 5, 11, 1, 10, 1, 1, 7, 1, 9, 10, 1, 1, 9, 1, 1, 10, 1, 1, 1, 1, 1, 10, 11, 1, 9, 1, 10, 1, 5, 2, 2, 5, 1, 10, 1, 5, 2, 2, 2, 5, 4, 4, 4, 7, 4], 'is_beginning': [1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0]}
```
### Data Fields
- `char`: characters
- `char_type`: character types as adopted from []() by [deepcut](https://github.com/rkcosmos/deepcut)
- `is_beginning`: 1 if beginning of word else 0
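As an illustrative sketch (using an ASCII toy sample instead of real Thai text), the `char` and `is_beginning` sequences can be recombined into word tokens like this:

```python
def chars_to_words(chars, is_beginning):
    """Group a character sequence into words using the 1/0 word-beginning labels."""
    words = []
    for ch, begin in zip(chars, is_beginning):
        if begin == 1 or not words:
            words.append(ch)   # start a new word
        else:
            words[-1] += ch    # extend the current word
    return words

# Toy example: "thecat" segmented into ["the", "cat"].
print(chars_to_words(list("thecat"), [1, 0, 0, 1, 0, 0]))  # → ['the', 'cat']
```

The same function applied to a real sample's `char` and `is_beginning` fields yields the gold-standard word segmentation.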
### Data Splits
No explicit split is given.
## Dataset Creation
### Curation Rationale
The dataset was created from `wisesight-sentiment` to be a word tokenization benchmark that is closer to texts in the wild, since other Thai word tokenization datasets such as [BEST](https://aiforthai.in.th/corpus.php) are mostly texts from news articles, which do not have some real-world features like misspellings.
### Source Data
#### Initial Data Collection and Normalization
The data are sampled from `wisesight-sentiment` which has the following data collection and normalization:
- Style: Informal and conversational. With some news headlines and advertisement.
- Time period: Around 2016 to early 2019. With small amount from other period.
- Domains: Mixed. Majority are consumer products and services (restaurants, cosmetics, drinks, car, hotels), with some current affairs.
- Privacy:
- Only messages that made available to the public on the internet (websites, blogs, social network sites).
- For Facebook, this means the public comments (everyone can see) that made on a public page.
- Private/protected messages and messages in groups, chat, and inbox are not included.
- Usernames and non-public figure names are removed
- Phone numbers are masked (e.g. 088-888-8888, 09-9999-9999, 0-2222-2222)
- If you see any personal data still remain in the set, please tell us - so we can remove them.
- Alternations and modifications:
- Keep in mind that this corpus does not statistically represent anything in the language register.
- Large amount of messages are not in their original form. Personal data are removed or masked.
- Duplicated, leading, and trailing whitespaces are removed. Other punctuations, symbols, and emojis are kept intact.
- (Mis)spellings are kept intact.
- Messages longer than 2,000 characters are removed.
- Long non-Thai messages are removed. Duplicated message (exact match) are removed.
#### Who are the source language producers?
Social media users in Thailand
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
The annotation was done by several people, including Nitchakarn Chantarapratin, [Pattarawat Chormai](https://github.com/heytitle), [Ponrawee Prasertsom](https://github.com/ponrawee), [Jitkapat Sawatphol](https://github.com/jitkapat), [Nozomi Yamada](https://github.com/nozomiyamada), and [Attapol Rutherford](https://attapol.github.io/).
### Personal and Sensitive Information
- The authors tried to exclude any known personally identifiable information from this data set.
- Usernames and non-public figure names are removed
- Phone numbers are masked (e.g. 088-888-8888, 09-9999-9999, 0-2222-2222)
- If you see any personal data still remain in the set, please tell us - so we can remove them.
## Considerations for Using the Data
### Social Impact of Dataset
- word tokenization dataset from texts in the wild
### Discussion of Biases
- no guideline is given by the authors on word tokenization
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Thanks [PyThaiNLP](https://github.com/PyThaiNLP/pythainlp) community, [Kitsuchart Pasupa](http://www.it.kmitl.ac.th/~kitsuchart/) (Faculty of Information Technology, King Mongkut's Institute of Technology Ladkrabang), and [Ekapol Chuangsuwanich](https://www.cp.eng.chula.ac.th/en/about/faculty/ekapolc/) (Faculty of Engineering, Chulalongkorn University) for advice. The original Kaggle competition, using the first version of this corpus, can be found at https://www.kaggle.com/c/wisesight-sentiment/
### Licensing Information
CC0
### Citation Information
Dataset:
```
@software{bact_2019_3457447,
author = {Suriyawongkul, Arthit and
Chuangsuwanich, Ekapol and
Chormai, Pattarawat and
Polpanumas, Charin},
title = {PyThaiNLP/wisesight-sentiment: First release},
month = sep,
year = 2019,
publisher = {Zenodo},
version = {v1.0},
doi = {10.5281/zenodo.3457447},
url = {https://doi.org/10.5281/zenodo.3457447}
}
```
Character type features:
```
@inproceedings{haruechaiyasak2009tlex,
title={TLex: Thai lexeme analyser based on the conditional random fields},
author={Haruechaiyasak, Choochart and Kongyoung, Sarawoot},
booktitle={Proceedings of 8th International Symposium on Natural Language Processing},
year={2009}
}
```
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
zhangshuoming/c_llvm_O0_exebench_json_cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 515356461
num_examples: 566749
download_size: 154524123
dataset_size: 515356461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c_llvm_O0_exebench_json_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_HWERI__Llama2-7b-sharegpt4 | ---
pretty_name: Evaluation run of HWERI/Llama2-7b-sharegpt4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HWERI/Llama2-7b-sharegpt4](https://huggingface.co/HWERI/Llama2-7b-sharegpt4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HWERI__Llama2-7b-sharegpt4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T10:08:23.331981](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__Llama2-7b-sharegpt4/blob/main/results_2023-10-26T10-08-23.331981.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.0004058451132417743,\n \"f1\": 0.06141988255033573,\n\
\ \"f1_stderr\": 0.0014263478827371335,\n \"acc\": 0.369226585159047,\n\
\ \"acc_stderr\": 0.008577465355756637\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417743,\n\
\ \"f1\": 0.06141988255033573,\n \"f1_stderr\": 0.0014263478827371335\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \
\ \"acc_stderr\": 0.00442704598726516\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248115\n\
\ }\n}\n```"
repo_url: https://huggingface.co/HWERI/Llama2-7b-sharegpt4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T10_08_23.331981
path:
- '**/details_harness|drop|3_2023-10-26T10-08-23.331981.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T10-08-23.331981.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T10_08_23.331981
path:
- '**/details_harness|gsm8k|5_2023-10-26T10-08-23.331981.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T10-08-23.331981.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T10_08_23.331981
path:
- '**/details_harness|winogrande|5_2023-10-26T10-08-23.331981.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T10-08-23.331981.parquet'
- config_name: results
data_files:
- split: 2023_10_26T10_08_23.331981
path:
- results_2023-10-26T10-08-23.331981.parquet
- split: latest
path:
- results_2023-10-26T10-08-23.331981.parquet
---
# Dataset Card for Evaluation run of HWERI/Llama2-7b-sharegpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HWERI/Llama2-7b-sharegpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HWERI/Llama2-7b-sharegpt4](https://huggingface.co/HWERI/Llama2-7b-sharegpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HWERI__Llama2-7b-sharegpt4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-26T10:08:23.331981](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__Llama2-7b-sharegpt4/blob/main/results_2023-10-26T10-08-23.331981.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417743,
"f1": 0.06141988255033573,
"f1_stderr": 0.0014263478827371335,
"acc": 0.369226585159047,
"acc_stderr": 0.008577465355756637
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417743,
"f1": 0.06141988255033573,
"f1_stderr": 0.0014263478827371335
},
"harness|gsm8k|5": {
"acc": 0.026535253980288095,
"acc_stderr": 0.00442704598726516
},
"harness|winogrande|5": {
"acc": 0.7119179163378059,
"acc_stderr": 0.012727884724248115
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zkdeng/dangerousSpiders | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Acantholycosa_lignaria
'1': Aglaoctenus_castaneus
'2': Aglaoctenus_lagotis
'3': Allocosa_funerea
'4': Allotrochosina_schauinslandi
'5': Alopecosa_albofasciata
'6': Alopecosa_barbipes
'7': Alopecosa_cuneata
'8': Alopecosa_inquilina
'9': Alopecosa_kochi
'10': Alopecosa_pulverulenta
'11': Anahita_punctulata
'12': Ancylometes_bogotensis
'13': Ancylometes_concolor
'14': Ancylometes_rufus
'15': Anoteropsis_hilaris
'16': Anoteropsis_litoralis
'17': Araneus_diadematus
'18': Arctosa_cinerea
'19': Arctosa_leopardus
'20': Arctosa_littoralis
'21': Arctosa_perita
'22': Arctosa_personata
'23': Asthenoctenus_borellii
'24': Aulonia_albimana
'25': Centroctenus_brevipes
'26': Cheiracanthium_erraticum
'27': Cheiracanthium_gracile
'28': Cheiracanthium_inclusum
'29': Cheiracanthium_mildei
'30': Cheiracanthium_punctorium
'31': Ctenus_amphora
'32': Ctenus_hibernalis
'33': Ctenus_medius
'34': Ctenus_ornatus
'35': Cupiennius_coccineus
'36': Cupiennius_getazi
'37': Cupiennius_salei
'38': Diapontia_uruguayensis
'39': Eratigena_agrestis
'40': Geolycosa_vultuosa
'41': Gladicosa_gulosa
'42': Gladicosa_pulchra
'43': Hippasa_holmerae
'44': Hogna_antelucana
'45': Hogna_baltimoriana
'46': Hogna_bivittata
'47': Hogna_carolinensis
'48': Hogna_crispipes
'49': Hogna_frondicola
'50': Hogna_gumia
'51': Hogna_radiata
'52': Lampona_cylindrata
'53': Latrodectus_bishopi
'54': Latrodectus_curacaviensis
'55': Latrodectus_geometricus
'56': Latrodectus_hasselti
'57': Latrodectus_hesperus
'58': Latrodectus_katipo
'59': Latrodectus_mactans
'60': Latrodectus_mirabilis
'61': Latrodectus_renivulvatus
'62': Latrodectus_tredecimguttatus
'63': Latrodectus_variolus
'64': Loxosceles_amazonica
'65': Loxosceles_deserta
'66': Loxosceles_laeta
'67': Loxosceles_reclusa
'68': Loxosceles_rufescens
'69': Loxosceles_tenochtitlan
'70': Loxosceles_yucatana
'71': Lycosa_erythrognatha
'72': Lycosa_hispanica
'73': Lycosa_pampeana
'74': Lycosa_praegrandis
'75': Lycosa_singoriensis
'76': Lycosa_tarantula
'77': Missulena_bradleyi
'78': Missulena_occatoria
'79': Paratrochosina_amica
'80': Pardosa_amentata
'81': Pardosa_lapidicina
'82': Pardosa_mercurialis
'83': Pardosa_moesta
'84': Pardosa_wagleri
'85': Phoneutria_boliviensis
'86': Phoneutria_depilata
'87': Phoneutria_fera
'88': Phoneutria_nigriventer
'89': Phoneutria_pertyi
'90': Phoneutria_reidyi
'91': Pirata_piraticus
'92': Portacosa_cinerea
'93': Rabidosa_hentzi
'94': Rabidosa_punctulata
'95': Rabidosa_rabida
'96': Schizocosa_avida
'97': Schizocosa_malitiosa
'98': Schizocosa_mccooki
'99': Sicarius_thomisoides
'100': Sosippus_californicus
'101': Tigrosa_annexa
'102': Tigrosa_aspersa
'103': Tigrosa_georgicola
'104': Tigrosa_helluo
'105': Trochosa_ruricola
'106': Trochosa_sepulchralis
'107': Trochosa_terricola
'108': Tropicosa_moesta
'109': Venator_immansuetus
'110': Venator_spenceri
'111': Venatrix_furcillata
'112': Wadicosa_fidelis
'113': Xerolycosa_miniata
'114': Xerolycosa_nemoralis
splits:
- name: train
num_bytes: 4290587998.03
num_examples: 166895
download_size: 3551438155
dataset_size: 4290587998.03
---
# Dataset Card for "dangerousSpiders"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmu-mlsp/encodec_24khz-opt-125m-lm_pretraining_ls960_1qt-librispeech_asr-train.clean.100-features | ---
dataset_info:
features:
- name: file
sequence: string
- name: text
sequence: string
- name: speaker_id
sequence: int64
- name: chapter_id
sequence: int64
- name: id
sequence: string
- name: audio_codes
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 759983
num_examples: 10
download_size: 114897
dataset_size: 759983
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "encodec_24khz-opt-125m-lm_pretraining_ls960_1qt-librispeech_asr-train.clean.100-features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jth500/T5_sft | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: decoder_input_ids
sequence: int64
- name: decoder_attention_mask
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 628899.0
num_examples: 108
download_size: 194310
dataset_size: 628899.0
---
# Dataset Card for "T5_sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FastFit/claim_stance_55 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 187302
num_examples: 1675
- name: test
num_bytes: 53256
num_examples: 480
download_size: 122573
dataset_size: 240558
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_25_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 191229
num_examples: 6699
download_size: 122240
dataset_size: 191229
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_25_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/5e4a199e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 247
num_examples: 10
download_size: 1418
dataset_size: 247
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "5e4a199e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamjweintraut/bart-finetuned-eli5_precomputed_best_slice-512_2023-12-10_run | ---
dataset_info:
features:
- name: q_id
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: predicted
dtype: string
- name: label
dtype: string
- name: rougeL_min_precision
dtype: float64
- name: rougeL_min_recall
dtype: float64
- name: rougeL_min_fmeasure
dtype: float64
- name: rougeL_median_precision
dtype: float64
- name: rougeL_median_recall
dtype: float64
- name: rougeL_median_fmeasure
dtype: float64
- name: rougeL_max_precision
dtype: float64
- name: rougeL_max_recall
dtype: float64
- name: rougeL_max_fmeasure
dtype: float64
- name: nli-roberta_label
dtype: string
- name: nli-roberta_plot_vals
dtype: int64
- name: nli-roberta-max-score
dtype: float64
- name: sent_sim
dtype: float32
- name: context_predicted_sim
dtype: float32
- name: context_label_sim
dtype: float32
- name: predicted_label_sim
dtype: float32
splits:
- name: train
num_bytes: 15907389
num_examples: 1250
download_size: 9855757
dataset_size: 15907389
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hlillemark/c4_t5_test | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 263101800
num_examples: 49270
- name: validation
num_bytes: 26283480
num_examples: 4922
download_size: 121664633
dataset_size: 289385280
---
# Dataset Card for "c4_t5_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jfrenz/legalglue | ---
language:
- en
- da
- de
- nl
- sv
- bg
- cs
- hr
- pl
- sk
- sl
- es
- fr
- it
- pt
- ro
- et
- fi
- hu
- lt
- lv
- el
- mt
multilinguality:
- multilingual
source_datasets:
- extended
task_categories:
- text-classification
- token-classification
task_ids:
- named-entity-recognition
- multi-label-classification
- topic-classification
pretty_name: LegalGLUE
tags:
- german-ler
- lener-br
---
# Dataset Card for "LegalGLUE"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://git.rwth-aachen.de/johanna.frenz/legalglue
### Dataset Summary
The "Legal General Language Understanding Evaluation" (LegalGLUE) dataset was created as part of a bachelor thesis.
It consists of four already existing datasets covering three task types and a total of 23 different languages.
### Supported Tasks
<table>
<tr><td>Dataset</td><td>Source</td><td>Task Type</td><td>Languages</td></tr>
<tr><td>German_LER</td><td> <a href="https://arxiv.org/abs/2003.13016">Leitner et al.</a></td><td>Named Entity Recognition</td><td>German</td></tr>
<tr><td>LeNER_Br</td><td> <a href="https://github.com/peluz/lener-br"> de Araujo et al., 2018</a></td><td>Named Entity Recognition</td><td> Portuguese </td></tr>
<tr><td>SwissJudgmentPrediction</td><td> <a href="https://arxiv.org/abs/2110.00806">Niklaus et al.</a> </td><td>Binary Text Classification</td><td>German, French, Italian</td></tr>
<tr><td>MultEURLEX</td><td> <a href="https://arxiv.org/abs/2109.00904">Chalkidis et al. </a> </td><td>Multi-label Text Classification</td><td>23 languages (see below)</td></tr>
</table>
### Languages
See the [Data Splits](#data-splits) section.
## Dataset Structure
### Data Instances
#### German_LER
German_LER example
```python
from datasets import load_dataset
dataset = load_dataset('jfrenz/legalglue', 'german_ler')
```
```json
{
'id': '66722',
'tokens':['4.', 'Die', 'Kostenentscheidung', 'für', 'das', 'gerichtliche', 'Antragsverfahren', 'beruht', 'auf', '§', '21', 'Abs.', '2', 'Satz', '1', 'i.', 'V.', 'm.', '§', '20', 'Abs.', '1', 'Satz', '1', 'WBO', '.'],
'ner_tags': [38, 38, 38, 38, 38, 38, 38, 38, 38, 3, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 22, 38]
}
```
#### LeNER-Br
LeNER-Br example
```python
from datasets import load_dataset
dataset = load_dataset('jfrenz/legalglue', 'lener_br')
```
```json
{
'id': '7826',
'tokens': ['Firmado', 'por', 'assinatura', 'digital', '(', 'MP', '2.200-2/2001', ')', 'JOSÉ', 'ROBERTO', 'FREIRE', 'PIMENTA', 'Ministro', 'Relator', 'fls', '.', 'PROCESSO', 'Nº', 'TST-RR-1603-79.2010.5.20.0001'],
'ner_tags': [0, 0, 0, 0, 0, 9, 10, 0, 3, 4, 4, 4, 0, 0, 0, 0, 11, 12, 12]}
```
#### SwissJudgmentPrediction
swissJudgmentPrediction_de example
```python
from datasets import load_dataset
dataset = load_dataset('jfrenz/legalglue', 'swissJudgmentPrediction_de')
```
```json
{
'id': 48755,
'year': 2014,
'text': "Sachverhalt: A. X._ fuhr am 25. Juli 2012 bei Mülligen mit seinem Personenwagen auf dem zweiten Überholstreifen der Autobahn A1 in Richtung Zürich. Gemäss Anklage schloss er auf einen Lieferwagen auf und schwenkte vom zweiten auf den ersten Überholstreifen aus. Danach fuhr er an zwei Fahrzeugen rechts vorbei und wechselte auf die zweite Überholspur zurück. B. Das Obergericht des Kantons Aargau erklärte X._ am 14. Januar 2014 zweitinstanzlich der groben Verletzung der Verkehrsregeln schuldig. Es bestrafte ihn mit einer bedingten Geldstrafe von 30 Tagessätzen zu Fr. 430.-- und einer Busse von Fr. 3'000.--. C. X._ führt Beschwerde in Strafsachen. Er beantragt, er sei von Schuld und Strafe freizusprechen. Eventualiter sei die Sache an die Vorinstanz zurückzuweisen. ",
'label': 0,
'language': 'de',
'region': 'Northwestern Switzerland',
'canton': 'ag',
'legal area': 'penal law'
}
```
#### MultiEURLEX
Monolingual example from the MultiEURLEX dataset
```python
from datasets import load_dataset
dataset = load_dataset('jfrenz/legalglue', 'multi_eurlex_de')
```
```json
{
'celex_id': '32002R0130',
'text': 'Verordnung (EG) Nr. 130/2002 der Kommission\nvom 24. Januar 2002\nbezüglich der im Rahmen der Auss...',
'labels': [3, 17, 5]}
```
Multilingual example from the MultiEURLEX dataset
```python
from datasets import load_dataset
dataset = load_dataset('jfrenz/legalglue', 'multi_eurlex_all_languages')
```
```json
{
'celex_id': '32002R0130',
'text': {
'bg': None,
'cs': None,
'da': 'Kommissionens ...',
'de': 'Verordnung ... ',
'el': '...',
'en': '...',
...
},
'labels': [3, 17, 5]
}
```
### Data Fields
#### German_LER
- `id`: id of the sample
- `tokens`: the tokens of the sample text
- `ner_tags`: the NER tags of each token
#### LeNER_Br
- `id`: id of the sample
- `tokens`: the tokens of the sample text
- `ner_tags`: the NER tags of each token
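For both NER configurations, `ner_tags` holds integer ids. With the `datasets` library the string names can be recovered from the feature metadata via `dataset.features["ner_tags"].feature.names`; a self-contained sketch of pairing tokens with decoded tags (the `ID2TAG` mapping below is a made-up stand-in, not the real tag set of either corpus):

```python
# Hypothetical id -> tag mapping; in practice, build it from
# dataset.features["ner_tags"].feature.names after loading the dataset.
ID2TAG = {0: "O", 3: "B-PER", 4: "I-PER", 9: "B-LAW", 10: "I-LAW"}

def decode(tokens: list[str], tag_ids: list[int]) -> list[tuple[str, str]]:
    """Pair each token with its string NER tag, defaulting to 'O'."""
    return [(tok, ID2TAG.get(t, "O")) for tok, t in zip(tokens, tag_ids)]

pairs = decode(["JOSÉ", "ROBERTO"], [3, 4])
```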
#### SwissJudgmentPrediction
- `id`: (**int**) ID of the document
- `year`: (**int**) the publication year
- `text`: (**str**) the facts of the case
- `label`: (**class label**) the judgment outcome: 0 (dismissal) or 1 (approval)
- `language`: (**str**) one of (de, fr, it)
- `region`: (**str**) the region of the lower court
- `canton`: (**str**) the canton of the lower court
- `legal area`: (**str**) the legal area of the case
#### MultiEURLEX
Monolingual use:
- `celex_id`: (**str**) Official Document ID of the document
- `text`: (**str**) An EU Law
- `labels`: (**List[int]**) List of relevant EUROVOC concepts (labels)
Multilingual use:
- `celex_id`: (**str**) Official Document ID of the document
- `text`: (dict[**str**]) A dictionary with the 23 languages as keys and the corresponding EU Law as values.
- `labels`: (**List[int]**) List of relevant EUROVOC concepts (labels)
The `labels` list consists by default of level 1 EUROVOC concepts. This can be changed by passing the `label_level` parameter when loading the dataset (available levels: `level_1`, `level_2`, `level_3`, `all_levels`).
```python
from datasets import load_dataset
dataset = load_dataset('jfrenz/legalglue', 'multi_eurlex_de', label_level="level_3")
```
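Since MultiEURLEX is a multi-label task, the integer label lists (e.g. `[3, 17, 5]` in the instance above) usually need to be turned into fixed-length multi-hot vectors before training a classifier. A stdlib-only sketch (the class count here is illustrative; use the size of the label vocabulary for the chosen `label_level`):

```python
def binarize(labels: list[int], num_classes: int) -> list[int]:
    """Turn a list of EUROVOC concept ids into a multi-hot vector."""
    vec = [0] * num_classes
    for lab in labels:
        vec[lab] = 1
    return vec

# Illustrative 21-class setup, matching the sample record above.
example_labels = [3, 17, 5]
vec = binarize(example_labels, num_classes=21)
```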
### Data Splits
<table>
<tr><th>Dataset</th><th> Language </th> <th> ISO code </th> <th> Number of Documents train/dev/test </th> </tr>
<tr><td>German-LER</td><td>German</td> <td><b>de</b></td> <td> 66723 / - / - </td> </tr>
<tr><td>LeNER-Br</td><td>Portuguese</td> <td><b>pt</b></td> <td> 7828 / 1177 / 1390 </td> </tr>
<tr><td rowspan="3">SwissJudgmentPrediction</td><td>German</td> <td><b>de</b></td> <td> 35458 / 4705 / 9725 </td> </tr>
<tr><td> French </td><td><b>fr</b></td><td> 21179 / 3095 / 6820 </td> </tr>
<tr><td> Italian </td><td><b>it</b></td><td> 3072 / 408 / 812 </td> </tr>
<tr><td rowspan="23">MultiEURLEX</td><td>English </td> <td><b>en</b></td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> German </td> <td> <b>de</b> </td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> French </td> <td> <b>fr</b> </td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> Italian </td> <td> <b>it</b> </td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> Spanish </td> <td> <b>es</b> </td> <td> 52,785 / 5,000 / 5,000 </td> </tr>
<tr><td> Polish </td> <td> <b>pl</b> </td> <td> 23,197 / 5,000 / 5,000 </td> </tr>
<tr><td> Romanian </td> <td> <b>ro</b> </td> <td> 15,921 / 5,000 / 5,000 </td> </tr>
<tr><td> Dutch </td> <td> <b>nl</b> </td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> Greek </td> <td> <b>el</b> </td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> Hungarian </td> <td> <b>hu</b> </td> <td> 22,664 / 5,000 / 5,000 </td> </tr>
<tr><td> Portuguese </td> <td> <b>pt</b> </td> <td> 23,188 / 5,000 / 5,000 </td> </tr>
<tr><td> Czech </td> <td> <b>cs</b> </td> <td> 23,187 / 5,000 / 5,000 </td> </tr>
<tr><td> Swedish </td> <td> <b>sv</b> </td> <td> 42,490 / 5,000 / 5,000 </td> </tr>
<tr><td> Bulgarian </td> <td> <b>bg</b> </td> <td> 15,986 / 5,000 / 5,000 </td> </tr>
<tr><td> Danish </td> <td> <b>da</b> </td> <td> 55,000 / 5,000 / 5,000 </td> </tr>
<tr><td> Finnish </td> <td> <b>fi</b> </td> <td> 42,497 / 5,000 / 5,000 </td> </tr>
<tr><td> Slovak </td> <td> <b>sk</b> </td> <td> 15,986 / 5,000 / 5,000 </td> </tr>
<tr><td> Lithuanian </td> <td> <b>lt</b> </td> <td> 23,188 / 5,000 / 5,000 </td> </tr>
<tr><td> Croatian </td> <td> <b>hr</b> </td> <td> 7,944 / 2,500 / 5,000 </td> </tr>
<tr><td> Slovene </td> <td> <b>sl</b> </td> <td> 23,184 / 5,000 / 5,000 </td> </tr>
<tr><td> Estonian </td> <td> <b>et</b> </td> <td> 23,126 / 5,000 / 5,000 </td> </tr>
<tr><td> Latvian </td> <td> <b>lv</b> </td> <td> 23,188 / 5,000 / 5,000 </td> </tr>
<tr><td> Maltese </td> <td> <b>mt</b> </td> <td> 17,521 / 5,000 / 5,000 </td> </tr>
</table>
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
singhamal1710/crypto_text_corpus | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-dd7fa31c-e9a7-4d4e-81bc-102bff5d38c4-3721 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: autoevaluate/natural-language-inference
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: validation
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: autoevaluate/natural-language-inference
* Dataset: glue
* Config: mrpc
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
liuyanchen1015/MULTI_VALUE_stsb_analytic_whose_relativizer | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 204
num_examples: 1
- name: test
num_bytes: 118
num_examples: 1
- name: train
num_bytes: 351
num_examples: 1
download_size: 9755
dataset_size: 673
---
# Dataset Card for "MULTI_VALUE_stsb_analytic_whose_relativizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_CM_D_PNP_GENERIC_A_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 12469542
num_examples: 1000
download_size: 2191224
dataset_size: 12469542
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_CM_D_PNP_GENERIC_A_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
IZAY/prueba | ---
license: other
---
|
GreeneryScenery/SheepsCanny | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
- name: conditioning_image_2
dtype: image
splits:
- name: train
num_bytes: 1507768570.06
num_examples: 22719
download_size: 1290896004
dataset_size: 1507768570.06
---
# Dataset Card for "SheepsCanny"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Barowski/wm | ---
license: openrail
---
|
nguyenthanhdo/dolphin_mqa_details_vi | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 28509274
num_examples: 15037
download_size: 12692096
dataset_size: 28509274
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dolphin_mqa_details_vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BuroIdentidadDigital/Ine_Frontal | ---
license: c-uda
---
|
macrocosm/arxiv_abstracts | ---
license: mit
language:
- en
size_categories:
- 1M<n<10M
---
All 2.3 million papers on arXiv, each embedded via its abstract with the InstructorXL model.
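For nearest-neighbour search over the abstracts, cosine similarity is the usual scoring function for such embeddings. A stdlib-only sketch (the vectors below are tiny dummies; real InstructorXL embeddings are much higher-dimensional, and the dataset's column names should be checked before use):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Dummy 4-d embeddings standing in for InstructorXL outputs.
query = [0.1, 0.3, 0.5, 0.1]
corpus = {
    "paper_a": [0.1, 0.3, 0.5, 0.1],
    "paper_b": [0.9, 0.1, 0.0, 0.0],
}
best = max(corpus, key=lambda k: cosine(query, corpus[k]))
```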
No claims are made about the copyright or license of the contained materials. We assume no responsibility for, and are not liable under any circumstances for, damages. Use at your own risk.
Good luck, have fun. |
Cristofher/perritos_y_no_perritos | ---
annotations_creators:
- found
language: []
language_creators: []
license:
- apache-2.0
multilinguality: []
pretty_name: 'Perritos-y-no-Perritos'
size_categories:
- n<1K
source_datasets:
- original
tags:
- animals
- dogs
- creature-dataset
task_categories:
- image-classification
task_ids:
- binary-class-image-classification
---
## Dataset Description
TODO
### Dataset Summary
TODO
## Dataset Creation
TODO
|
NikiTricky/digital-bg | ---
task_categories:
- text-generation
- summarization
- text-classification
language:
- bg
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: "posts.json"
---
# Digital.bg articles |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-67000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 649582
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Falah/toy_figure_descriptions | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 242803
num_examples: 1000
download_size: 31441
dataset_size: 242803
---
# Dataset Card for "toy_figure_descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_50_1713137867 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 226786
num_examples: 568
download_size: 115135
dataset_size: 226786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hoodhahmed/dhivehi_corpus | ---
license: openrail
---
|
thavens/ufb_chosen | ---
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
dataset_info:
features:
- name: messages
list:
- name: condition
dtype: string
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_prefs
num_bytes: 123744575
num_examples: 61135
- name: test_prefs
num_bytes: 4054763
num_examples: 2000
download_size: 69641312
dataset_size: 127799338
---
# Dataset Card for "ufb_chosen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1 | ---
pretty_name: Evaluation run of maywell/TinyLlama-MoE-Chat-0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/TinyLlama-MoE-Chat-0.1](https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T02:02:06.630482](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1/blob/main/results_2024-01-08T02-02-06.630482.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2969807305972257,\n\
\ \"acc_stderr\": 0.03233735615614679,\n \"acc_norm\": 0.2990966138461531,\n\
\ \"acc_norm_stderr\": 0.03313317327044684,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456411,\n \"mc2\": 0.3781526709576764,\n\
\ \"mc2_stderr\": 0.01431580872082323\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3267918088737201,\n \"acc_stderr\": 0.013706665975587335,\n\
\ \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.01388064457015621\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43397729535949015,\n\
\ \"acc_stderr\": 0.0049460892301530284,\n \"acc_norm\": 0.5672176857199761,\n\
\ \"acc_norm_stderr\": 0.004944485990639527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\
\ \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.2709677419354839,\n\
\ \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.36363636363636365,\n \"acc_stderr\": 0.034273086529999365,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.034273086529999365\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.03239637046735703,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.03239637046735703\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.28974358974358977,\n \"acc_stderr\": 0.02300062824368795,\n\
\ \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.02300062824368795\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063435,\n \
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063435\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26788990825688075,\n\
\ \"acc_stderr\": 0.018987462257978652,\n \"acc_norm\": 0.26788990825688075,\n\
\ \"acc_norm_stderr\": 0.018987462257978652\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802747,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802747\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.03149328104507957,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.03149328104507957\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3459915611814346,\n \"acc_stderr\": 0.030964810588786706,\n \
\ \"acc_norm\": 0.3459915611814346,\n \"acc_norm_stderr\": 0.030964810588786706\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041019,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041019\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.37606837606837606,\n\
\ \"acc_stderr\": 0.03173393632969481,\n \"acc_norm\": 0.37606837606837606,\n\
\ \"acc_norm_stderr\": 0.03173393632969481\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.31800766283524906,\n\
\ \"acc_stderr\": 0.016653486275615404,\n \"acc_norm\": 0.31800766283524906,\n\
\ \"acc_norm_stderr\": 0.016653486275615404\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n\
\ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961445,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961445\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.35691318327974275,\n\
\ \"acc_stderr\": 0.02721042037593402,\n \"acc_norm\": 0.35691318327974275,\n\
\ \"acc_norm_stderr\": 0.02721042037593402\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3117283950617284,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.3117283950617284,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27640156453715775,\n\
\ \"acc_stderr\": 0.011422153194553577,\n \"acc_norm\": 0.27640156453715775,\n\
\ \"acc_norm_stderr\": 0.011422153194553577\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913509,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407315,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407315\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209194,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209194\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456411,\n \"mc2\": 0.3781526709576764,\n\
\ \"mc2_stderr\": 0.01431580872082323\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5966850828729282,\n \"acc_stderr\": 0.013787257285896245\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \
\ \"acc_stderr\": 0.00410662063774967\n }\n}\n```"
repo_url: https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|arc:challenge|25_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|arc:challenge|25_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|gsm8k|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|gsm8k|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hellaswag|10_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hellaswag|10_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T00-05-57.757345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T02-02-06.630482.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- '**/details_harness|winogrande|5_2024-01-08T00-05-57.757345.parquet'
- split: 2024_01_08T02_02_06.630482
path:
- '**/details_harness|winogrande|5_2024-01-08T02-02-06.630482.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T02-02-06.630482.parquet'
- config_name: results
data_files:
- split: 2024_01_08T00_05_57.757345
path:
- results_2024-01-08T00-05-57.757345.parquet
- split: 2024_01_08T02_02_06.630482
path:
- results_2024-01-08T02-02-06.630482.parquet
- split: latest
path:
- results_2024-01-08T02-02-06.630482.parquet
---
# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat-0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/TinyLlama-MoE-Chat-0.1](https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1",
"harness_winogrande_5",
split="train")
```
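Since each timestamped split is named after its run, the run that the "latest" alias resolves to can be recovered by parsing and comparing the split names. A minimal sketch, assuming the underscore-for-separator naming convention used throughout this card (the two split names below are the ones from this repository):

```python
from datetime import datetime

# The two timestamped splits present in this card's configurations.
splits = ["2024_01_08T00_05_57.757345", "2024_01_08T02_02_06.630482"]

def parse_split(name: str) -> datetime:
    # Split names replace the ISO separators '-' and ':' with underscores.
    date_part, time_part = name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

# The most recent run is what the "latest" split points to.
latest = max(splits, key=parse_split)
print(latest)  # → 2024_01_08T02_02_06.630482
```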
## Latest results
These are the [latest results from run 2024-01-08T02:02:06.630482](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1/blob/main/results_2024-01-08T02-02-06.630482.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2969807305972257,
"acc_stderr": 0.03233735615614679,
"acc_norm": 0.2990966138461531,
"acc_norm_stderr": 0.03313317327044684,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456411,
"mc2": 0.3781526709576764,
"mc2_stderr": 0.01431580872082323
},
"harness|arc:challenge|25": {
"acc": 0.3267918088737201,
"acc_stderr": 0.013706665975587335,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.01388064457015621
},
"harness|hellaswag|10": {
"acc": 0.43397729535949015,
"acc_stderr": 0.0049460892301530284,
"acc_norm": 0.5672176857199761,
"acc_norm_stderr": 0.004944485990639527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.034273086529999365,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.034273086529999365
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.03239637046735703,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.03239637046735703
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.02300062824368795,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.02300062824368795
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02934457250063435,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02934457250063435
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802747,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802747
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.03149328104507957,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.03149328104507957
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3459915611814346,
"acc_stderr": 0.030964810588786706,
"acc_norm": 0.3459915611814346,
"acc_norm_stderr": 0.030964810588786706
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.37606837606837606,
"acc_stderr": 0.03173393632969481,
"acc_norm": 0.37606837606837606,
"acc_norm_stderr": 0.03173393632969481
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.31800766283524906,
"acc_stderr": 0.016653486275615404,
"acc_norm": 0.31800766283524906,
"acc_norm_stderr": 0.016653486275615404
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961445,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961445
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.35691318327974275,
"acc_stderr": 0.02721042037593402,
"acc_norm": 0.35691318327974275,
"acc_norm_stderr": 0.02721042037593402
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3117283950617284,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.3117283950617284,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27640156453715775,
"acc_stderr": 0.011422153194553577,
"acc_norm": 0.27640156453715775,
"acc_norm_stderr": 0.011422153194553577
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913509,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407315,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407315
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209194,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209194
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456411,
"mc2": 0.3781526709576764,
"mc2_stderr": 0.01431580872082323
},
"harness|winogrande|5": {
"acc": 0.5966850828729282,
"acc_stderr": 0.013787257285896245
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.00410662063774967
}
}
```
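The per-task scores above follow a regular `harness|<task>|<n_shots>` key scheme, so they can be extracted and aggregated programmatically. A minimal sketch (the `mmlu_accuracies` helper is illustrative, not part of any official tooling; the embedded excerpt copies two entries from the results block above):

```python
import json

# Excerpt in the same shape as the results block above.
results_json = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.3373493975903614},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.2631578947368421}
}
"""

def mmlu_accuracies(results: dict) -> dict:
    """Map each hendrycksTest subject name to its accuracy."""
    return {
        task.split("|")[1].removeprefix("hendrycksTest-"): scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    }

accs = mmlu_accuracies(json.loads(results_json))
mean_acc = sum(accs.values()) / len(accs)  # unweighted mean over subjects
```

Note this computes an unweighted mean over subjects, which matches how the leaderboard's aggregate MMLU score is usually reported, rather than weighting by the number of questions per subject.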
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Srijan15/unit_tests_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9118
num_examples: 5
download_size: 15171
dataset_size: 9118
---
# Dataset Card for "unit_tests_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chihoonlee10__T3Q-Mistral-Orca-Math-DPO | ---
pretty_name: Evaluation run of chihoonlee10/T3Q-Mistral-Orca-Math-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chihoonlee10/T3Q-Mistral-Orca-Math-DPO](https://huggingface.co/chihoonlee10/T3Q-Mistral-Orca-Math-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chihoonlee10__T3Q-Mistral-Orca-Math-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T11:23:06.810128](https://huggingface.co/datasets/open-llm-leaderboard/details_chihoonlee10__T3Q-Mistral-Orca-Math-DPO/blob/main/results_2024-03-14T11-23-06.810128.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6509552315782638,\n\
\ \"acc_stderr\": 0.032115991758341594,\n \"acc_norm\": 0.6498675437816118,\n\
\ \"acc_norm_stderr\": 0.03279412148246691,\n \"mc1\": 0.6389228886168911,\n\
\ \"mc1_stderr\": 0.01681431284483688,\n \"mc2\": 0.7841207838591833,\n\
\ \"mc2_stderr\": 0.013603836368242053\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.719079864568811,\n\
\ \"acc_stderr\": 0.004485300194072271,\n \"acc_norm\": 0.8922525393347939,\n\
\ \"acc_norm_stderr\": 0.003094275186361528\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.016525425898773503,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.016525425898773503\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6389228886168911,\n\
\ \"mc1_stderr\": 0.01681431284483688,\n \"mc2\": 0.7841207838591833,\n\
\ \"mc2_stderr\": 0.013603836368242053\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624184\n }\n}\n```"
repo_url: https://huggingface.co/chihoonlee10/T3Q-Mistral-Orca-Math-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|arc:challenge|25_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|gsm8k|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hellaswag|10_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T11-23-06.810128.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T11-23-06.810128.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- '**/details_harness|winogrande|5_2024-03-14T11-23-06.810128.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T11-23-06.810128.parquet'
- config_name: results
data_files:
- split: 2024_03_14T11_23_06.810128
path:
- results_2024-03-14T11-23-06.810128.parquet
- split: latest
path:
- results_2024-03-14T11-23-06.810128.parquet
---
# Dataset Card for Evaluation run of chihoonlee10/T3Q-Mistral-Orca-Math-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chihoonlee10/T3Q-Mistral-Orca-Math-DPO](https://huggingface.co/chihoonlee10/T3Q-Mistral-Orca-Math-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chihoonlee10__T3Q-Mistral-Orca-Math-DPO",
"harness_winogrande_5",
	split="latest")
```
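Because the timestamped split names use a fixed-width `YYYY_MM_DDTHH_MM_SS` format, they sort chronologically as plain strings. If you prefer not to rely on the `latest` alias, you can pick the newest run yourself — a minimal sketch (the list of split names below is illustrative; this repository currently contains a single run):

```python
# Pick the most recent run from a list of timestamped split names.
# Names like "2024_03_14T11_23_06.810128" sort chronologically as strings,
# so max() over the timestamped names returns the newest run.
def newest_run(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

# Illustrative split names (only one run exists in this repository):
splits = ["2024_03_14T11_23_06.810128", "latest"]
print(newest_run(splits))  # -> 2024_03_14T11_23_06.810128
```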
## Latest results
These are the [latest results from run 2024-03-14T11:23:06.810128](https://huggingface.co/datasets/open-llm-leaderboard/details_chihoonlee10__T3Q-Mistral-Orca-Math-DPO/blob/main/results_2024-03-14T11-23-06.810128.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.6509552315782638,
"acc_stderr": 0.032115991758341594,
"acc_norm": 0.6498675437816118,
"acc_norm_stderr": 0.03279412148246691,
"mc1": 0.6389228886168911,
"mc1_stderr": 0.01681431284483688,
"mc2": 0.7841207838591833,
"mc2_stderr": 0.013603836368242053
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.719079864568811,
"acc_stderr": 0.004485300194072271,
"acc_norm": 0.8922525393347939,
"acc_norm_stderr": 0.003094275186361528
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773503,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6389228886168911,
"mc1_stderr": 0.01681431284483688,
"mc2": 0.7841207838591833,
"mc2_stderr": 0.013603836368242053
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
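The "all" entry above aggregates the per-task scores. As a rough sketch of how such a macro-average can be recomputed from the per-task entries (using a small illustrative excerpt rather than the full result set, so the mean below intentionally differs from the aggregated "all" values):

```python
# Average acc_norm over a few per-task entries from the results above.
# Illustrative excerpt only -- the real "all" value averages every task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7039473684210527},
}
scores = [task["acc_norm"] for task in results.values()]
macro_avg = sum(scores) / len(scores)
print(round(macro_avg, 4))  # -> 0.557
```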
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hugginglearners/data-science-job-salaries | ---
license:
- cc0-1.0
kaggle_id: ruchi798/data-science-job-salaries
---
# Dataset Card for Data Science Job Salaries
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/ruchi798/data-science-job-salaries
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
### Content
| Column | Description |
|--------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| work_year | The year the salary was paid. |
| experience_level | The experience level in the job during the year, with the following possible values: EN (Entry-level / Junior), MI (Mid-level / Intermediate), SE (Senior-level / Expert), EX (Executive-level / Director) |
| employment_type | The type of employment for the role: PT (Part-time), FT (Full-time), CT (Contract), FL (Freelance) |
| job_title | The role worked in during the year. |
| salary | The total gross salary amount paid. |
| salary_currency | The currency of the salary paid as an ISO 4217 currency code. |
| salary_in_usd | The salary in USD (FX rate divided by avg. USD rate for the respective year via fxdata.foorilla.com). |
| employee_residence | Employee's primary country of residence during the work year, as an ISO 3166 country code. |
| remote_ratio | The overall amount of work done remotely; possible values are: 0 = no remote work (less than 20%), 50 = partially remote, 100 = fully remote (more than 80%) |
| company_location | The country of the employer's main office or contracting branch as an ISO 3166 country code. |
| company_size | The average number of people that worked for the company during the year: S = less than 50 employees (small), M = 50 to 250 employees (medium), L = more than 250 employees (large) |
### Acknowledgements
I'd like to thank ai-jobs.net Salaries for aggregating this data!
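The coded columns described in the table above can be mapped back to human-readable labels. The snippet below is a minimal sketch assuming the schema in that table; the `decode` helper and the sample record are hypothetical, not part of the dataset.

```python
# Illustrative sketch: decoding the coded columns for a single (made-up)
# record. Field names match the dataset schema; values are hypothetical.

EXPERIENCE = {"EN": "Entry-level / Junior", "MI": "Mid-level / Intermediate",
              "SE": "Senior-level / Expert", "EX": "Executive-level / Director"}
EMPLOYMENT = {"PT": "Part-time", "FT": "Full-time",
              "CT": "Contract", "FL": "Freelance"}
REMOTE = {0: "No remote work (less than 20%)", 50: "Partially remote",
          100: "Fully remote (more than 80%)"}
COMPANY_SIZE = {"S": "less than 50 employees", "M": "50 to 250 employees",
                "L": "more than 250 employees"}

def decode(row: dict) -> dict:
    """Return a copy of `row` with the coded fields replaced by labels."""
    out = dict(row)
    out["experience_level"] = EXPERIENCE[row["experience_level"]]
    out["employment_type"] = EMPLOYMENT[row["employment_type"]]
    out["remote_ratio"] = REMOTE[row["remote_ratio"]]
    out["company_size"] = COMPANY_SIZE[row["company_size"]]
    return out

row = {"work_year": 2022, "experience_level": "SE", "employment_type": "FT",
       "job_title": "Data Scientist", "salary_in_usd": 120000,
       "remote_ratio": 100, "company_size": "M"}
print(decode(row)["experience_level"])  # Senior-level / Expert
```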
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@ruchi798](https://kaggle.com/ruchi798)
### Licensing Information
The license for this dataset is cc0-1.0
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
octoz/Dominguinhos | ---
license: cc-by-3.0
---
|
suvadityamuk/image-generation-prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 5261
num_examples: 29
download_size: 5251
dataset_size: 5261
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seanxh/twitter_dataset_1713192500 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 54006
num_examples: 124
download_size: 24135
dataset_size: 54006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2 | ---
pretty_name: Evaluation run of Sao10K/Fimbulvetr-11B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-16T11:41:30.859795](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2/blob/main/results_2024-03-16T11-41-30.859795.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6710297689958459,\n\
\ \"acc_stderr\": 0.03151550667899731,\n \"acc_norm\": 0.6724350896358895,\n\
\ \"acc_norm_stderr\": 0.0321521489538622,\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6342749025395696,\n\
\ \"mc2_stderr\": 0.0156107236020673\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441372,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.01337407861506874\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.696673969328819,\n\
\ \"acc_stderr\": 0.00458755357710126,\n \"acc_norm\": 0.877912766381199,\n\
\ \"acc_norm_stderr\": 0.00326717445844976\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632158,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632158\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846322,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884864,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884864\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026622,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026622\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4983240223463687,\n\
\ \"acc_stderr\": 0.016722407608296398,\n \"acc_norm\": 0.4983240223463687,\n\
\ \"acc_norm_stderr\": 0.016722407608296398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5058670143415906,\n\
\ \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.5058670143415906,\n\
\ \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789513,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174927,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174927\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6342749025395696,\n\
\ \"mc2_stderr\": 0.0156107236020673\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6467020470053071,\n \
\ \"acc_stderr\": 0.013166337192115683\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Fimbulvetr-11B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|arc:challenge|25_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|arc:challenge|25_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|gsm8k|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|gsm8k|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hellaswag|10_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hellaswag|10_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T11-33-56.371102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T11-41-30.859795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T11-41-30.859795.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- '**/details_harness|winogrande|5_2024-03-16T11-33-56.371102.parquet'
- split: 2024_03_16T11_41_30.859795
path:
- '**/details_harness|winogrande|5_2024-03-16T11-41-30.859795.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-16T11-41-30.859795.parquet'
- config_name: results
data_files:
- split: 2024_03_16T11_33_56.371102
path:
- results_2024-03-16T11-33-56.371102.parquet
- split: 2024_03_16T11_41_30.859795
path:
- results_2024-03-16T11-41-30.859795.parquet
- split: latest
path:
- results_2024-03-16T11-41-30.859795.parquet
---
# Dataset Card for Evaluation run of Sao10K/Fimbulvetr-11B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2",
"harness_winogrande_5",
split="train")
```
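The split names inside each configuration encode run timestamps, plus a `latest` alias, so the most recent run can be identified by sorting the timestamped names. A minimal sketch using the split names from this card (the names are hard-coded here; in practice they would come from the loaded dataset):

```python
# Split names from this card: two timestamped runs plus the "latest" alias.
splits = ["2024_03_16T11_33_56.371102", "2024_03_16T11_41_30.859795", "latest"]

# The timestamp format (zero-padded, most-significant field first) sorts
# chronologically as plain strings, so the newest run is the lexicographic max.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)  # → 2024_03_16T11_41_30.859795
```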
## Latest results
These are the [latest results from run 2024-03-16T11:41:30.859795](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Fimbulvetr-11B-v2/blob/main/results_2024-03-16T11-41-30.859795.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6710297689958459,
"acc_stderr": 0.03151550667899731,
"acc_norm": 0.6724350896358895,
"acc_norm_stderr": 0.0321521489538622,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6342749025395696,
"mc2_stderr": 0.0156107236020673
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441372,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.01337407861506874
},
"harness|hellaswag|10": {
"acc": 0.696673969328819,
"acc_stderr": 0.00458755357710126,
"acc_norm": 0.877912766381199,
"acc_norm_stderr": 0.00326717445844976
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846322,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884864,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884864
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026622,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026622
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500107,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4983240223463687,
"acc_stderr": 0.016722407608296398,
"acc_norm": 0.4983240223463687,
"acc_norm_stderr": 0.016722407608296398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5058670143415906,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.5058670143415906,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789513,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174927,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174927
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6342749025395696,
"mc2_stderr": 0.0156107236020673
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825912
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
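The per-task entries above share the `harness|hendrycksTest-*|5` naming scheme, so aggregate MMLU-style scores can be recomputed directly from the results JSON. A small sketch, using a hypothetical three-task subset of the results above for illustration:

```python
# Three of the per-task entries above (a subset, chosen only for illustration)
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7631578947368421},
}

# Average acc_norm over every task whose key carries the hendrycksTest prefix
scores = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mean_acc_norm = sum(scores) / len(scores)
print(round(mean_acc_norm, 4))  # → 0.5769
```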
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp | ---
pretty_name: Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest\
\ results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T03:42:19.232314](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2024-02-02T03-42-19.232314.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530732061402786,\n\
\ \"acc_stderr\": 0.031986064565857564,\n \"acc_norm\": 0.6546095792380836,\n\
\ \"acc_norm_stderr\": 0.0326302871117009,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5684120643866822,\n\
\ \"mc2_stderr\": 0.015214628002199675\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.655646285600478,\n\
\ \"acc_stderr\": 0.004741859753178433,\n \"acc_norm\": 0.8500298745269866,\n\
\ \"acc_norm_stderr\": 0.0035631244274585173\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n\
\ \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n\
\ \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n\
\ \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033467,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521271,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521271\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990932,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990932\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518012,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518012\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700033,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700033\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5684120643866822,\n\
\ \"mc2_stderr\": 0.015214628002199675\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515314\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6497346474601972,\n \
\ \"acc_stderr\": 0.013140409455571276\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-42-19.232314.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- '**/details_harness|winogrande|5_2024-02-02T03-42-19.232314.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T03-42-19.232314.parquet'
- config_name: results
data_files:
- split: 2024_02_02T03_42_19.232314
path:
- results_2024-02-02T03-42-19.232314.parquet
- split: latest
path:
- results_2024-02-02T03-42-19.232314.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp](https://huggingface.co/Weyaxi/Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp",
"harness_winogrande_5",
split="train")
```
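The timestamped split names appear to be derived from the run timestamp by replacing `-` and `:` with underscores (e.g. run `2024-02-02T03:42:19.232314` becomes split `2024_02_02T03_42_19.232314`). A small helper sketching that mapping (an assumption inferred from the names in this card, not an official `datasets` API):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2024-02-02T03:42:19.232314'
    to the corresponding split name '2024_02_02T03_42_19.232314'.

    Hypothetical helper: the convention is inferred from the split
    names listed in this card's config section.
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(timestamp_to_split("2024-02-02T03:42:19.232314"))
# -> 2024_02_02T03_42_19.232314
```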
## Latest results
These are the [latest results from run 2024-02-02T03:42:19.232314](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Newton-OpenHermes-2.5-neural-chat-v3-3-Slerp/blob/main/results_2024-02-02T03-42-19.232314.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530732061402786,
"acc_stderr": 0.031986064565857564,
"acc_norm": 0.6546095792380836,
"acc_norm_stderr": 0.0326302871117009,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5684120643866822,
"mc2_stderr": 0.015214628002199675
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.655646285600478,
"acc_stderr": 0.004741859753178433,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.0035631244274585173
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033467,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521271,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521271
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990932,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990932
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323788,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323788
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518012,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518012
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700033,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700033
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5684120643866822,
"mc2_stderr": 0.015214628002199675
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515314
},
"harness|gsm8k|5": {
"acc": 0.6497346474601972,
"acc_stderr": 0.013140409455571276
}
}
```
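As an illustrative sketch of working with these results (the dict literal below is a truncated excerpt of the JSON above, and the helper is not part of any official API), the per-subtask MMLU (`hendrycksTest`) scores can be ranked by normalized accuracy:

```python
# Truncated excerpt of the results JSON above, keyed by harness task name.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.35},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8931623931623932},
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.5301204819277109},
}

mmlu_scores = {
    # "harness|hendrycksTest-marketing|5" -> "marketing"
    name.split("|")[1].split("-", 1)[1]: metrics["acc_norm"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
}

best_task = max(mmlu_scores, key=mmlu_scores.get)
worst_task = min(mmlu_scores, key=mmlu_scores.get)
print(best_task, worst_task)  # -> marketing abstract_algebra
```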
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sberhe/2023-1000-software-release-notes | ---
license: cc
---
|
nguyenthanhdo/zac2023-voice | ---
license: mit
---
|
alagaesia/auto-sql-create-context | ---
license: agpl-3.0
---
|
open-llm-leaderboard/details_KKare__Misgit-7B-slerp | ---
pretty_name: Evaluation run of KKare/Misgit-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KKare/Misgit-7B-slerp](https://huggingface.co/KKare/Misgit-7B-slerp) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KKare__Misgit-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T23:36:00.829962](https://huggingface.co/datasets/open-llm-leaderboard/details_KKare__Misgit-7B-slerp/blob/main/results_2024-04-09T23-36-00.829962.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653053157795838,\n\
\ \"acc_stderr\": 0.032022087294985284,\n \"acc_norm\": 0.6551016327182547,\n\
\ \"acc_norm_stderr\": 0.03266594180149566,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5584739322380099,\n\
\ \"mc2_stderr\": 0.015134730682201182\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.01413770860175909,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6602270464050985,\n\
\ \"acc_stderr\": 0.004726640532562037,\n \"acc_norm\": 0.855008962358096,\n\
\ \"acc_norm_stderr\": 0.003513722251954683\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941187,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941187\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834836,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834836\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5584739322380099,\n\
\ \"mc2_stderr\": 0.015134730682201182\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.013428382481274242\n }\n}\n```"
repo_url: https://huggingface.co/KKare/Misgit-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-36-00.829962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-36-00.829962.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- '**/details_harness|winogrande|5_2024-04-09T23-36-00.829962.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T23-36-00.829962.parquet'
- config_name: results
data_files:
- split: 2024_04_09T23_36_00.829962
path:
- results_2024-04-09T23-36-00.829962.parquet
- split: latest
path:
- results_2024-04-09T23-36-00.829962.parquet
---
# Dataset Card for Evaluation run of KKare/Misgit-7B-slerp
Dataset automatically created during the evaluation run of model [KKare/Misgit-7B-slerp](https://huggingface.co/KKare/Misgit-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KKare__Misgit-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-09T23:36:00.829962](https://huggingface.co/datasets/open-llm-leaderboard/details_KKare__Misgit-7B-slerp/blob/main/results_2024-04-09T23-36-00.829962.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.653053157795838,
"acc_stderr": 0.032022087294985284,
"acc_norm": 0.6551016327182547,
"acc_norm_stderr": 0.03266594180149566,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5584739322380099,
"mc2_stderr": 0.015134730682201182
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.01413770860175909,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.6602270464050985,
"acc_stderr": 0.004726640532562037,
"acc_norm": 0.855008962358096,
"acc_norm_stderr": 0.003513722251954683
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834836,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834836
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659856,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5584739322380099,
"mc2_stderr": 0.015134730682201182
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274242
}
}
```
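As a quick sanity check of the table above, the `hendrycksTest` (MMLU) entries can be averaged directly from the parsed JSON. A minimal sketch, with a toy two-task dict standing in for the real file:

```python
from statistics import mean

def mmlu_average(results: dict) -> float:
    """Average 'acc' over all hendrycksTest (MMLU) sub-task entries."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return mean(accs)

# Toy stand-in for the real results dict (which has 57 MMLU sub-tasks):
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7},
    "harness|arc:challenge|25": {"acc": 0.62},  # skipped: not an MMLU task
}
print(round(mmlu_average(sample), 2))  # → 0.65
```

On the real file, load the JSON first (`results = json.load(open(path))["results"]` or similar, depending on the file layout) and pass that dict in.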
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
polinaeterna/test_splits | ---
dataset_info:
features:
- name: x
dtype: int64
- name: y
dtype: string
splits:
- name: train
num_bytes: 116
num_examples: 8
- name: test
num_bytes: 46
num_examples: 3
download_size: 1698
dataset_size: 162
---
# Dataset Card for "test_splits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dan-Kos/arxivannotations | ---
license: mit
task_categories:
- summarization
language:
- en
size_categories:
- 1M<n<10M
---
| Title | Annotation | PDF | Latex |
|:-------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------|:--------|
| Axion bremsstrahlung from collisions of global strings | We calculate axion radiation emitted in the collision of two straight global strings. The strings are supposed to be in the unexcited ground state, to be inclined with respect to each other, and to move in parallel planes. Radiation arises when the point of minimal separation between the strings moves faster than light. This effect exhibits a typical Cerenkov nature. Surprisingly, it allows an alternative interpretation as bremsstrahlung under a collision of point charges in 2+1 electrodynamics. This can be demonstrated by suitable world-sheet reparameterizations and dimensional reduction. Cosmological estimates show that our mechanism generates axion production comparable with that from the oscillating string loops and may lead to further restrictions on the axion window.... | https://export.arxiv.org/pdf/astro-ph/0310718 | \... |
This dataset consists of many CSV files; the name of each file contains the category of the scientific articles it holds, and each file contains 1024 articles.
The first column, Title, is the title of the article (string).
The next column, Annotation, is an annotation (abstract) of the article (string).
The next column, PDF, is a link to the PDF file of the article (string).
The last column, Latex, is the full text of the article in TeX format (string). |
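A minimal sketch of reading one of these files with Python's standard `csv` module — the inline one-row sample is made up to match the schema above; the real files are named after their arXiv category and hold 1024 rows each:

```python
import csv
import io

# Hypothetical one-row sample in the four-column format described above.
sample = io.StringIO(
    "Title,Annotation,PDF,Latex\n"
    '"Axion bremsstrahlung from collisions of global strings",'
    '"We calculate axion radiation emitted in the collision...",'
    '"https://export.arxiv.org/pdf/astro-ph/0310718",'
    '"\\documentclass{article}..."\n'
)

rows = list(csv.DictReader(sample))
print(len(rows), rows[0]["PDF"])
# → 1 https://export.arxiv.org/pdf/astro-ph/0310718
```

For a real file, replace the `io.StringIO` sample with `open("astro-ph.csv", newline="")` (filename hypothetical).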
peihaowang/edgnn-hypergraph-dataset | ---
license: mit
---
# Equivariant Hypergraph Diffusion Neural Operators
The official data release of ICLR 2023 paper [Equivariant Hypergraph Diffusion Neural Operators](https://arxiv.org/abs/2207.06680).
Peihao Wang, Shenghao Yang, Yunyu Liu, Zhangyang (Atlas) Wang, Pan Li
Please refer to our [GitHub repo](https://github.com/Graph-COM/ED-HNN) for more details.
|