id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
BaekRok/vishing_data_3 | 2023-08-19T23:03:44.000Z | [
"region:us"
] | BaekRok | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: labels
list:
- name: start
dtype: float64
- name: text
dtype: string
- name: end
dtype: float64
- name: speaker
dtype: string
- name: label
dtype: string
- name: seg_num
dtype: int64
- name: total_seg
dtype: int64
- name: prob
dtype: float64
splits:
- name: train
num_bytes: 63272599622.8
num_examples: 16496
- name: validation
num_bytes: 8038911926.9
num_examples: 2071
- name: test
num_bytes: 8566637155.26
num_examples: 2156
download_size: 7301508547
dataset_size: 79878148704.95999
---
# Dataset Card for "vishing_data_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thomasavare/italian-dataset-deepl | 2023-08-21T10:48:24.000Z | [
"language:en",
"language:it",
"region:us"
] | thomasavare | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: english
dtype: string
- name: italian
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: train
num_bytes: 62294
num_examples: 500
download_size: 22849
dataset_size: 62294
language:
- en
- it
---
# Dataset Card for "italian-dataset-deepl"
English-to-Italian translation was produced with the DeepL API.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/openreview_raw_65 | 2023-08-19T15:13:04.000Z | [
"license:mit",
"region:us"
] | dim | null | null | null | 0 | 0 | ---
license: mit
dataset_info:
features:
- name: paper_url
dtype: string
- name: paper_id
dtype: string
- name: arxiv_link
dtype: string
- name: reviews
list:
- name: cdate
dtype: int64
- name: content
struct:
- name: confidence
dtype: string
- name: nominate_for_a_reproducibility_award
dtype: string
- name: rating
dtype: string
- name: review
dtype: string
- name: reviews_visibility
dtype: string
- name: title
dtype: string
- name: ddate
dtype: 'null'
- name: forum
dtype: string
- name: id
dtype: string
- name: invitation
dtype: string
- name: mdate
dtype: int64
- name: nonreaders
sequence: 'null'
- name: number
dtype: int64
- name: original
dtype: 'null'
- name: readers
sequence: string
- name: replyto
dtype: string
- name: signatures
sequence: string
- name: tcdate
dtype: int64
- name: tddate
dtype: 'null'
- name: tmdate
dtype: int64
- name: writers
sequence: string
- name: latex
dtype: string
splits:
- name: train
num_bytes: 3115419
num_examples: 65
download_size: 1491308
dataset_size: 3115419
---
|
jemmalmeida/json-pub | 2023-08-19T15:19:12.000Z | [
"region:us"
] | jemmalmeida | null | null | null | 0 | 0 | Entry not found |
jsn27/medical_faq | 2023-08-19T15:26:46.000Z | [
"license:mit",
"region:us"
] | jsn27 | null | null | null | 0 | 0 | ---
license: mit
---
|
NarchAI1992/2 | 2023-08-19T15:35:42.000Z | [
"license:openrail",
"region:us"
] | NarchAI1992 | null | null | null | 0 | 0 | ---
license: openrail
---
|
MateiCv/spa-eng_pos_tagging | 2023-08-19T15:37:22.000Z | [
"region:us"
] | MateiCv | null | null | null | 0 | 0 | Entry not found |
AI-C/rvc-data | 2023-08-19T15:42:49.000Z | [
"region:us"
] | AI-C | null | null | null | 0 | 0 | Entry not found |
miesepeter/CEFR_vocab_tokens | 2023-09-09T22:33:16.000Z | [
"license:other",
"region:us"
] | miesepeter | null | null | null | 0 | 0 | ---
license: other
---
A dataset of English words classified into CEFR categories, with tokenized forms produced by a SentencePiece tokenizer.
The license follows that of the foundational dataset, accessible at: http://www.englishprofile.org/wordlists/terms-of-use |
amrllama/jobs | 2023-08-19T15:46:57.000Z | [
"region:us"
] | amrllama | null | null | null | 0 | 0 | |
xuantruong/musicaiz | 2023-08-19T16:11:31.000Z | [
"region:us"
] | xuantruong | null | null | null | 0 | 0 | Entry not found |
GarlicBread99/Stormtrooper | 2023-08-19T15:50:42.000Z | [
"region:us"
] | GarlicBread99 | null | null | null | 0 | 0 | Entry not found |
KatMarie/sv_corpora_parliament_processed | 2023-08-19T16:04:11.000Z | [
"region:us"
] | KatMarie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 292351437
num_examples: 1892723
download_size: 161955796
dataset_size: 292351437
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sv_corpora_parliament_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harshasurampudi/legal-reasoning-lfqa-synthetic | 2023-08-19T16:21:42.000Z | [
"size_categories:10K<n<100K",
"region:us"
] | harshasurampudi | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: Context
dtype: string
- name: Question
dtype: string
- name: Legal Reasoning
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 31911499
num_examples: 14991
- name: test
num_bytes: 3176252
num_examples: 1497
- name: validation
num_bytes: 3186381
num_examples: 1496
download_size: 21924127
dataset_size: 38274132
size_categories:
- 10K<n<100K
---
# Dataset Card for "legal-reasoning-lfqa-synthetic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bsmock/ICDAR-2013.c | 2023-09-07T04:49:39.000Z | [
"license:cdla-permissive-2.0",
"table structure recognition",
"table extraction",
"arxiv:2303.00716",
"region:us"
] | bsmock | null | null | null | 0 | 0 | ---
license: cdla-permissive-2.0
tags:
- table structure recognition
- table extraction
---
# ICDAR-2013.c
The ICDAR-2013.c dataset was released in 2023.
You can think of ICDAR-2013.c as a fork (a modified version, in this case by different authors) of the original ICDAR-2013 dataset from the ICDAR 2013 Table Competition.
It contains:
- manual corrections to minor annotation mistakes in the original dataset
- automated adjustments (such as canonicalization) that correct oversegmentation and make the dataset more consistent with other TSR datasets, such as PubTables-1M
For more details about this version (2023) of the dataset and the manual corrections made to the original dataset, please see ["Aligning benchmark datasets for table structure recognition"](https://arxiv.org/abs/2303.00716).
For the code used to create this dataset, see [https://github.com/microsoft/table-transformer](https://github.com/microsoft/table-transformer).
## Citing
If you use this dataset in your published work, please cite:
```
@inproceedings{smock2023aligning,
title={Aligning benchmark datasets for table structure recognition},
author={Smock, Brandon and Pesala, Rohith and Abraham, Robin},
booktitle={International Conference on Document Analysis and Recognition},
pages={371--386},
year={2023},
organization={Springer}
}
```
## About the original ICDAR-2013 dataset
The original dataset was released as part of the ICDAR 2013 Table Competition.
It can be downloaded [here](https://roundtrippdf.com/en/downloads/), but as of August 2023, accessing the files returns a 403 Forbidden error.
We release a copy of the original dataset but with manual corrections to fix minor annotation mistakes [here](https://huggingface.co/datasets/bsmock/ICDAR-2013-Table-Competition-Corrected).
### Original license
There is no known license for the original dataset, but the data is commonly referred to as "public", and so we interpret this to mean there are no license restrictions on the original data.
According to [this website](https://roundtrippdf.com/en/data-extraction/pdf-table-recognition-dataset/) from Tamir Hassan (as of August 2023): "These documents have been collected systematically from the European Union and US Government websites, and we therefore expect them to have public domain status."
Associated [code](https://github.com/tamirhassan/dataset-tools) for the data for the 2013 competition carries an Apache-2.0 license. |
indavedea/online | 2023-08-19T19:17:49.000Z | [
"region:us"
] | indavedea | null | null | null | 0 | 0 | Entry not found |
cg4/louxtest | 2023-08-19T16:32:32.000Z | [
"region:us"
] | cg4 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 35537.014925373136
num_examples: 120
- name: validation
num_bytes: 4145.985074626866
num_examples: 14
download_size: 31049
dataset_size: 39683.0
---
# Dataset Card for "louxtest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Superdetec/ghoul | 2023-08-19T16:36:38.000Z | [
"license:openrail",
"region:us"
] | Superdetec | null | null | null | 0 | 0 | ---
license: openrail
---
|
Jing24/generate_sub_5 | 2023-08-19T16:38:17.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 34459076
num_examples: 38107
download_size: 6236290
dataset_size: 34459076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
uwunion/instruct_svg | 2023-08-20T05:47:42.000Z | [
"license:cc",
"region:us"
] | uwunion | null | null | null | 2 | 0 | ---
license: cc
dataset_info:
features:
- name: image
dtype: image
- name: input
dtype: string
- name: output
dtype: string
- name: description_0
dtype: string
- name: description_1
dtype: string
splits:
- name: train
num_bytes: 8627552.0
num_examples: 617
download_size: 7810230
dataset_size: 8627552.0
---
|
usernamedesu/aichan-public-v3 | 2023-08-19T16:49:28.000Z | [
"region:us"
] | usernamedesu | null | null | null | 0 | 0 | Entry not found |
skuenzli/FluidSimulation | 2023-08-19T16:52:51.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | skuenzli | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
|
optimacare/testdataset | 2023-08-19T17:04:42.000Z | [
"region:us"
] | optimacare | null | null | null | 0 | 0 | Entry not found |
Matias12f/cats_dogs | 2023-08-19T17:00:11.000Z | [
"license:apache-2.0",
"region:us"
] | Matias12f | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
safgasgfsa/ClosingThemeVoiceModel | 2023-08-19T17:12:26.000Z | [
"region:us"
] | safgasgfsa | null | null | null | 0 | 0 | Entry not found |
qwerty123123/venv | 2023-08-19T17:47:57.000Z | [
"license:apache-2.0",
"region:us"
] | qwerty123123 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
cabrooks/levenshtein_filter_50k_wordpiece | 2023-08-20T21:36:26.000Z | [
"license:openrail",
"region:us"
] | cabrooks | null | null | null | 0 | 0 | ---
license: openrail
---
|
yzhuang/autotree_nxor_l1_128 | 2023-08-19T18:21:32.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 28890800000
num_examples: 100000
- name: validation
num_bytes: 2889080000
num_examples: 10000
- name: test
num_bytes: 2889080000
num_examples: 10000
download_size: 33056635077
dataset_size: 34668960000
---
# Dataset Card for "autotree_nxor_l1_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SURF-FluidSimulation/Test | 2023-08-21T12:45:32.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | SURF-FluidSimulation | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
|
Glavin001/generate-questions-v0.1 | 2023-08-19T18:22:06.000Z | [
"region:us"
] | Glavin001 | null | null | null | 1 | 0 | Entry not found |
fireworks-ai/msmarco_rank | 2023-08-19T18:37:52.000Z | [
"region:us"
] | fireworks-ai | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: train
num_bytes: 13976268494
num_examples: 398792
download_size: 7376188746
dataset_size: 13976268494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "msmarco_rank"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
optimacare/dolly_test | 2023-08-19T18:34:49.000Z | [
"region:us"
] | optimacare | null | null | null | 0 | 0 | Entry not found |
POSSUMCho/NanuakComparisons | 2023-08-19T18:51:38.000Z | [
"license:unknown",
"region:us"
] | POSSUMCho | null | null | null | 0 | 0 | ---
license: unknown
---
|
yzhuang/autotree_snxor_l1_128 | 2023-08-19T19:42:40.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence:
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 14452400000
num_examples: 100000
- name: validation
num_bytes: 1445240000
num_examples: 10000
- name: test
num_bytes: 1445240000
num_examples: 10000
download_size: 16568698860
dataset_size: 17342880000
---
# Dataset Card for "autotree_snxor_l1_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
paulinaaguilera/trabajo_final | 2023-08-19T19:42:29.000Z | [
"license:apache-2.0",
"region:us"
] | paulinaaguilera | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
mbeaty2/data | 2023-08-19T19:42:09.000Z | [
"region:us"
] | mbeaty2 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2929210.0
num_examples: 127
download_size: 1529434
dataset_size: 2929210.0
---
# Dataset Card for "data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
simonveitner/finetune_trial | 2023-08-19T19:53:25.000Z | [
"region:us"
] | simonveitner | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 124231
num_examples: 250
download_size: 0
dataset_size: 124231
---
# Dataset Card for "finetune_trial"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jabuticaba-br/codeinquero-cursos-graduacao | 2023-08-19T20:07:31.000Z | [
"region:us"
] | jabuticaba-br | null | null | null | 0 | 0 | Entry not found |
Jing24/generate_sub_6 | 2023-08-19T20:02:55.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 27347294
num_examples: 30298
download_size: 4949263
dataset_size: 27347294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Disfluency/disfluency-es-16k-v2 | 2023-08-19T20:33:28.000Z | [
"region:us"
] | Disfluency | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 62314644.0
num_examples: 780
- name: test
num_bytes: 8874608.0
num_examples: 120
download_size: 65179414
dataset_size: 71189252.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "disfluency-es-16k-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Briqz23/instituto_uno | 2023-08-19T20:34:37.000Z | [
"region:us"
] | Briqz23 | null | null | null | 0 | 0 | "<HUMAN> I'd like to know more about Uno's institute, <ASSISTANT>: Instituto Uno offers specialized education for adolescents and children in institutional care and who have a significant gap in education, starting with literacy."
"<HUMAN> I'd like to know about your mission, <ASSISTANT>: Our mission is to bring an innovative and extremely effective education to all adolescents and children in situations of social vulnerability, based on principles such as respect, listening, feeling, collaborating, affection, love."
"<HUMAN> What is Uno's Institute?,<ASSISTANT>: Instituto Uno is an education program focused on literacyfor children and adolescents sheltered in shelters, who are past the ideal literacy age."
"<HUMAN> What is the relationship between quero saber valor and Uno's institute?, <ASSISTANT>: Instituto Uno has the Quero Saber program, on literacy, and the Quero Saber Valor program, on finance."
"<HUMAN> Why should I donate to Uno's institute? <ASSISTANT>: Uno's institute is financial education project that addresses two fundamental aspects, personal finance and behavioral finance, and offers opportunities for teenagers to organize and plan financially."
"<HUMAN> What have you done till now?, <ASSISTANT>: Since 2011, we have served 310 children and adolescents in 28 reception services, carrying out around 5,000 'in loco' educational activities, complemented with 17 excursions and 41 external exploratory studies, with the support of 170 dedicated and committed eco-educator volunteers, working weekly for over one year continuously."
"<HUMAN> What's your belief?, <ASSISTANT>: If you believe that educating is everyone's responsibility, Instituto Uno makes this possibility something possible and accessible to anyone who is willing to go through a transforming personal experience. Go to the website and get in touch."
"<HUMAN> I would like to make a donation,You can make your donation through the Instituto Uno website or by asking me to make the donation. How would you like to do it?"
"<HUMAN> What can you do?, <ASSISTANT>: I can tell you Uno's institute, why should you donate and their history"
|
ccml/dataset1 | 2023-09-11T01:16:05.000Z | [
"task_categories:conversational",
"size_categories:n<1K",
"license:mit",
"code",
"region:us"
] | ccml | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- conversational
tags:
- code
size_categories:
- n<1K
pretty_name: mydata-1
--- |
Briqz23/testeasdasd | 2023-08-19T20:46:17.000Z | [
"region:us"
] | Briqz23 | null | null | null | 0 | 0 | Entry not found |
matiasosorio/perro_y_no_perro | 2023-08-19T21:13:51.000Z | [
"region:us"
] | matiasosorio | null | null | null | 0 | 0 | ---
annotations_creators:
- found
language: []
language_creators: []
license:
- apache-2.0
multilinguality: []
pretty_name: Perritos-y-no-Perritos
size_categories:
- n<1K
source_datasets:
- original
tags:
- animals
- dogs
- creature-dataset
task_categories:
- image-classification
task_ids:
- binary-class-image-classification
--- |
Sedigh/DB-Mvec | 2023-08-19T21:09:13.000Z | [
"region:us"
] | Sedigh | null | null | null | 0 | 0 | Entry not found |
IUSEGPTLOL/LLM | 2023-08-19T21:13:19.000Z | [
"license:pddl",
"region:us"
] | IUSEGPTLOL | null | null | null | 0 | 0 | ---
license: pddl
---
|
RealTimeData/github_latest | 2023-10-09T00:32:48.000Z | [
"region:us"
] | RealTimeData | null | null | null | 0 | 0 | ---
{}
---
# Latest GitHub Repositories
You can always access the latest GitHub repositories via this dataset.
We update the dataset weekly, every Sunday, so it always provides the latest GitHub repositories from the past week.
The current dataset on the `main` branch contains the latest GitHub repositories submitted from 2023-09-25 to 2023-10-02.
Data collection was conducted on 2023-10-09.
Use the dataset via:
```
import datasets

ds = datasets.load_dataset('RealTimeData/github_latest')
```
# Previous versions
You can access previous versions by requesting different branches.
For example, you can find the 2023-08-06 version via:
```
import datasets

ds = datasets.load_dataset('RealTimeData/github_latest', revision='2023-08-06')
```
Check all available versions by clicking the "Files and versions" button on the top bar.
|
open-llm-leaderboard/details_huggyllama__llama-7b | 2023-08-28T19:49:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of None
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 119 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-7b\"\
,\n\t\"original_mmlu_world_religions_5\",\n\tsplit=\"train\")\n```\n\n## Latest\
\ results\n\nThese are the [latest results from run 2023-08-28T19:48:59.844014](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-7b/blob/main/results_2023-08-28T19%3A48%3A59.844014.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.35558052524707073,\n\
\ \"acc_stderr\": 0.03544077777004337\n },\n \"original|mmlu:abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081\n },\n\
\ \"original|mmlu:anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \
\ \"acc_stderr\": 0.04188307537595853\n },\n \"original|mmlu:astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.038781398887976104\n\
\ },\n \"original|mmlu:business_ethics|5\": {\n \"acc\": 0.41,\n \
\ \"acc_stderr\": 0.049431107042371025\n },\n \"original|mmlu:clinical_knowledge|5\"\
: {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779833\n\
\ },\n \"original|mmlu:college_biology|5\": {\n \"acc\": 0.375,\n \
\ \"acc_stderr\": 0.04048439222695598\n },\n \"original|mmlu:college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845\n },\n\
\ \"original|mmlu:college_computer_science|5\": {\n \"acc\": 0.3,\n \
\ \"acc_stderr\": 0.046056618647183814\n },\n \"original|mmlu:college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236\n },\n\
\ \"original|mmlu:college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.0358390175473641\n },\n \"original|mmlu:college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865\n\
\ },\n \"original|mmlu:computer_security|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05\n },\n \"original|mmlu:conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291\n\
\ },\n \"original|mmlu:econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022055\n },\n \"original|mmlu:electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914\n\
\ },\n \"original|mmlu:elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n\
\ \"acc_stderr\": 0.02271746789770862\n },\n \"original|mmlu:formal_logic|5\"\
: {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673\n\
\ },\n \"original|mmlu:global_facts|5\": {\n \"acc\": 0.31,\n \
\ \"acc_stderr\": 0.04648231987117316\n },\n \"original|mmlu:high_school_biology|5\"\
: {\n \"acc\": 0.33225806451612905,\n \"acc_stderr\": 0.026795560848122797\n\
\ },\n \"original|mmlu:high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n\
\ \"acc_stderr\": 0.03144712581678241\n },\n \"original|mmlu:high_school_computer_science|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505\n },\n\
\ \"original|mmlu:high_school_european_history|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.03851716319398393\n },\n \"original|mmlu:high_school_geography|5\"\
: {\n \"acc\": 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070643\n\
\ },\n \"original|mmlu:high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.03594413711272435\n\
\ },\n \"original|mmlu:high_school_macroeconomics|5\": {\n \"acc\"\
: 0.3384615384615385,\n \"acc_stderr\": 0.023991500500313036\n },\n \
\ \"original|mmlu:high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n\
\ \"acc_stderr\": 0.0263357394040558\n },\n \"original|mmlu:high_school_microeconomics|5\"\
: {\n \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.0302839955258844\n\
\ },\n \"original|mmlu:high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n\
\ \"acc_stderr\": 0.036030385453603854\n },\n \"original|mmlu:high_school_psychology|5\"\
: {\n \"acc\": 0.47706422018348627,\n \"acc_stderr\": 0.0214147570581755\n\
\ },\n \"original|mmlu:high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.03191923445686186\n },\n \"original|mmlu:high_school_us_history|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.03426712349247271\n\
\ },\n \"original|mmlu:high_school_world_history|5\": {\n \"acc\":\
\ 0.4388185654008439,\n \"acc_stderr\": 0.032302649315470375\n },\n \
\ \"original|mmlu:human_aging|5\": {\n \"acc\": 0.3991031390134529,\n \
\ \"acc_stderr\": 0.032867453125679603\n },\n \"original|mmlu:human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.0418644516301375\n\
\ },\n \"original|mmlu:international_law|5\": {\n \"acc\": 0.512396694214876,\n\
\ \"acc_stderr\": 0.04562951548180765\n },\n \"original|mmlu:jurisprudence|5\"\
: {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.04766075165356461\n\
\ },\n \"original|mmlu:logical_fallacies|5\": {\n \"acc\": 0.4233128834355828,\n\
\ \"acc_stderr\": 0.038818912133343826\n },\n \"original|mmlu:machine_learning|5\"\
: {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256\n\
\ },\n \"original|mmlu:management|5\": {\n \"acc\": 0.3300970873786408,\n\
\ \"acc_stderr\": 0.046561471100123514\n },\n \"original|mmlu:marketing|5\"\
: {\n \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.032583346493868806\n\
\ },\n \"original|mmlu:medical_genetics|5\": {\n \"acc\": 0.38,\n \
\ \"acc_stderr\": 0.04878317312145633\n },\n \"original|mmlu:miscellaneous|5\"\
: {\n \"acc\": 0.4240102171136654,\n \"acc_stderr\": 0.017672263329084222\n\
\ },\n \"original|mmlu:moral_disputes|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.026362437574546538\n },\n \"original|mmlu:moral_scenarios|5\"\
: {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889\n\
\ },\n \"original|mmlu:nutrition|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.027914055510468015\n },\n \"original|mmlu:philosophy|5\"\
: {\n \"acc\": 0.40192926045016075,\n \"acc_stderr\": 0.027846476005930477\n\
\ },\n \"original|mmlu:prehistory|5\": {\n \"acc\": 0.3487654320987654,\n\
\ \"acc_stderr\": 0.02651759772446501\n },\n \"original|mmlu:professional_accounting|5\"\
: {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059\n\
\ },\n \"original|mmlu:professional_law|5\": {\n \"acc\": 0.29726205997392435,\n\
\ \"acc_stderr\": 0.01167334617308605\n },\n \"original|mmlu:professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924\n \
\ },\n \"original|mmlu:professional_psychology|5\": {\n \"acc\": 0.3545751633986928,\n\
\ \"acc_stderr\": 0.019353360547553697\n },\n \"original|mmlu:public_relations|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.0469237132203465\n },\n\
\ \"original|mmlu:security_studies|5\": {\n \"acc\": 0.3346938775510204,\n\
\ \"acc_stderr\": 0.030209235226242307\n },\n \"original|mmlu:sociology|5\"\
: {\n \"acc\": 0.4527363184079602,\n \"acc_stderr\": 0.035197027175769155\n\
\ },\n \"original|mmlu:us_foreign_policy|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589\n },\n \"original|mmlu:virology|5\"\
: {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322415\n\
\ },\n \"original|mmlu:world_religions|5\": {\n \"acc\": 0.5029239766081871,\n\
\ \"acc_stderr\": 0.03834759370936839\n }\n}\n```"
repo_url: https://huggingface.co/None
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|arc:challenge|25_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|arc:challenge|25_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hellaswag|10_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hellaswag|10_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T21:59:42.894030.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:59:33.663836.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T16:59:33.663836.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T21:59:42.894030.parquet'
- split: 2023_08_23T16_59_33.663836
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T16:59:33.663836.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T16:59:33.663836.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T19:48:59.844014.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T19:48:59.844014.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T19_48_59.844014
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:48:59.844014.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T19:48:59.844014.parquet'
- config_name: results
data_files:
- split: 2023_08_19T21_59_42.894030
path:
- results_2023-08-19T21:59:42.894030.parquet
- split: 2023_08_23T16_59_33.663836
path:
- results_2023-08-23T16:59:33.663836.parquet
- split: 2023_08_28T19_48_59.844014
path:
- results_2023-08-28T19:48:59.844014.parquet
- split: latest
path:
- results_2023-08-28T19:48:59.844014.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-7b](https://huggingface.co/huggyllama/llama-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 119 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-7b",
"original_mmlu_world_religions_5",
split="train")
```
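The per-run splits follow a simple naming scheme: the run timestamp with hyphens and colons replaced by underscores (e.g. run `2023-08-28T19:48:59.844014` becomes split `2023_08_28T19_48_59.844014`). A minimal helper, inferred from the split names listed in this card, converts one to the other:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name by replacing hyphens
    and colons with underscores (naming scheme inferred from the
    split names listed in this card)."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-28T19:48:59.844014"))
# 2023_08_28T19_48_59.844014
```

Passing the resulting name as `split=` instead of `"train"` selects that specific run.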
## Latest results
These are the [latest results from run 2023-08-28T19:48:59.844014](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-7b/blob/main/results_2023-08-28T19%3A48%3A59.844014.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.35558052524707073,
"acc_stderr": 0.03544077777004337
},
"original|mmlu:abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081
},
"original|mmlu:anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853
},
"original|mmlu:astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.038781398887976104
},
"original|mmlu:business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025
},
"original|mmlu:clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779833
},
"original|mmlu:college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598
},
"original|mmlu:college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845
},
"original|mmlu:college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814
},
"original|mmlu:college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236
},
"original|mmlu:college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.0358390175473641
},
"original|mmlu:college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865
},
"original|mmlu:computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05
},
"original|mmlu:conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291
},
"original|mmlu:econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022055
},
"original|mmlu:electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914
},
"original|mmlu:elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770862
},
"original|mmlu:formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673
},
"original|mmlu:global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316
},
"original|mmlu:high_school_biology|5": {
"acc": 0.33225806451612905,
"acc_stderr": 0.026795560848122797
},
"original|mmlu:high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678241
},
"original|mmlu:high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505
},
"original|mmlu:high_school_european_history|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.03851716319398393
},
"original|mmlu:high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070643
},
"original|mmlu:high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.03594413711272435
},
"original|mmlu:high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.023991500500313036
},
"original|mmlu:high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558
},
"original|mmlu:high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.0302839955258844
},
"original|mmlu:high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603854
},
"original|mmlu:high_school_psychology|5": {
"acc": 0.47706422018348627,
"acc_stderr": 0.0214147570581755
},
"original|mmlu:high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186
},
"original|mmlu:high_school_us_history|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.03426712349247271
},
"original|mmlu:high_school_world_history|5": {
"acc": 0.4388185654008439,
"acc_stderr": 0.032302649315470375
},
"original|mmlu:human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.032867453125679603
},
"original|mmlu:human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.0418644516301375
},
"original|mmlu:international_law|5": {
"acc": 0.512396694214876,
"acc_stderr": 0.04562951548180765
},
"original|mmlu:jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461
},
"original|mmlu:logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826
},
"original|mmlu:machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256
},
"original|mmlu:management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.046561471100123514
},
"original|mmlu:marketing|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.032583346493868806
},
"original|mmlu:medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633
},
"original|mmlu:miscellaneous|5": {
"acc": 0.4240102171136654,
"acc_stderr": 0.017672263329084222
},
"original|mmlu:moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546538
},
"original|mmlu:moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889
},
"original|mmlu:nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.027914055510468015
},
"original|mmlu:philosophy|5": {
"acc": 0.40192926045016075,
"acc_stderr": 0.027846476005930477
},
"original|mmlu:prehistory|5": {
"acc": 0.3487654320987654,
"acc_stderr": 0.02651759772446501
},
"original|mmlu:professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059
},
"original|mmlu:professional_law|5": {
"acc": 0.29726205997392435,
"acc_stderr": 0.01167334617308605
},
"original|mmlu:professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924
},
"original|mmlu:professional_psychology|5": {
"acc": 0.3545751633986928,
"acc_stderr": 0.019353360547553697
},
"original|mmlu:public_relations|5": {
"acc": 0.4,
"acc_stderr": 0.0469237132203465
},
"original|mmlu:security_studies|5": {
"acc": 0.3346938775510204,
"acc_stderr": 0.030209235226242307
},
"original|mmlu:sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.035197027175769155
},
"original|mmlu:us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589
},
"original|mmlu:virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415
},
"original|mmlu:world_religions|5": {
"acc": 0.5029239766081871,
"acc_stderr": 0.03834759370936839
}
}
```
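As a quick illustration of working with these per-task scores, plain Python is enough to rank the tasks; the `scores` dict below is a hand-copied subset of the accuracies above, not the full results:

```python
# Hand-copied subset of the per-task accuracies reported above.
scores = {
    "original|mmlu:world_religions|5": 0.5029239766081871,
    "original|mmlu:computer_security|5": 0.45,
    "original|mmlu:moral_scenarios|5": 0.2424581005586592,
    "original|mmlu:college_physics|5": 0.22549019607843138,
}

# Task names sorted from strongest to weakest accuracy.
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0])   # original|mmlu:world_religions|5
print(ranked[-1])  # original|mmlu:college_physics|5
```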
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
openaccess-ai-collective/mmlu-evals | 2023-08-19T22:56:23.000Z | [
"region:us"
] | openaccess-ai-collective | null | null | null | 1 | 0 | Entry not found |
open-llm-leaderboard/details_huggyllama__llama-30b | 2023-09-16T23:45:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of huggyllama/llama-30b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [huggyllama/llama-30b](https://huggingface.co/huggyllama/llama-30b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggyllama__llama-30b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T23:44:55.901768](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-30b/blob/main/results_2023-09-16T23-44-55.901768.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298701,\n \"f1\": 0.06332634228187943,\n\
\ \"f1_stderr\": 0.0013742294190200051,\n \"acc\": 0.47445656434133393,\n\
\ \"acc_stderr\": 0.010516415781576863\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298701,\n\
\ \"f1\": 0.06332634228187943,\n \"f1_stderr\": 0.0013742294190200051\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \
\ \"acc_stderr\": 0.009797503180527876\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n\
\ }\n}\n```"
repo_url: https://huggingface.co/huggyllama/llama-30b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|arc:challenge|25_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|arc:challenge|25_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T23_44_55.901768
path:
- '**/details_harness|drop|3_2023-09-16T23-44-55.901768.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T23-44-55.901768.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T23_44_55.901768
path:
- '**/details_harness|gsm8k|5_2023-09-16T23-44-55.901768.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T23-44-55.901768.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hellaswag|10_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hellaswag|10_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T23:03:51.753289.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-19T23:03:51.753289.parquet'
- split: 2023_08_23T17_40_29.405074
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T17:40:29.405074.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T17:40:29.405074.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T23_44_55.901768
path:
- '**/details_harness|winogrande|5_2023-09-16T23-44-55.901768.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T23-44-55.901768.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T20:06:09.731721.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T20_06_09.731721
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T20:06:09.731721.parquet'
- config_name: results
data_files:
- split: 2023_08_19T23_03_51.753289
path:
- results_2023-08-19T23:03:51.753289.parquet
- split: 2023_08_23T17_40_29.405074
path:
- results_2023-08-23T17:40:29.405074.parquet
- split: 2023_08_28T20_06_09.731721
path:
- results_2023-08-28T20:06:09.731721.parquet
- split: 2023_09_16T23_44_55.901768
path:
- results_2023-09-16T23-44-55.901768.parquet
- split: latest
path:
- results_2023-09-16T23-44-55.901768.parquet
---
# Dataset Card for Evaluation run of huggyllama/llama-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggyllama/llama-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggyllama/llama-30b](https://huggingface.co/huggyllama/llama-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-30b",
"harness_winogrande_5",
split="train")
```
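The split names listed in the configs above appear to follow a simple convention: the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper name `timestamp_to_split` and the convention itself are assumptions inferred from the split names in this card, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this card's
    configs: "-" and ":" in the timestamp become "_" (an assumption
    based on the split names listed above)."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-16T23:44:55.901768"))
# 2023_09_16T23_44_55.901768

# To load a specific run instead of the latest one (requires network access):
# from datasets import load_dataset
# data = load_dataset("open-llm-leaderboard/details_huggyllama__llama-30b",
#                     "harness_winogrande_5",
#                     split=timestamp_to_split("2023-09-16T23:44:55.901768"))
```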
## Latest results
These are the [latest results from run 2023-09-16T23:44:55.901768](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-30b/blob/main/results_2023-09-16T23-44-55.901768.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298701,
"f1": 0.06332634228187943,
"f1_stderr": 0.0013742294190200051,
"acc": 0.47445656434133393,
"acc_stderr": 0.010516415781576863
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298701,
"f1": 0.06332634228187943,
"f1_stderr": 0.0013742294190200051
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.009797503180527876
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Jing24/generate_sub_7 | 2023-08-19T23:46:12.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 21022386
num_examples: 23401
download_size: 3811300
dataset_size: 21022386
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minionai/Deberta_training_data_aug23 | 2023-08-19T23:55:48.000Z | [
"region:us"
] | minionai | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chunk
dtype: string
- name: query
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 307042277
num_examples: 241660
- name: test
num_bytes: 12270877
num_examples: 9533
download_size: 73672404
dataset_size: 319313154
---
# Dataset Card for "Deberta_training_data_aug23"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hdudasna164/hdudasna164 | 2023-08-20T23:59:13.000Z | [
"region:us"
] | hdudasna164 | null | null | null | 0 | 0 | Entry not found |
Miyabe84/Miyabe84 | 2023-08-21T01:28:02.000Z | [
"region:us"
] | Miyabe84 | null | null | null | 0 | 0 | Entry not found |
Kagami565/Kagami565 | 2023-08-21T01:27:56.000Z | [
"region:us"
] | Kagami565 | null | null | null | 0 | 0 | Entry not found |
Nobusawa465/Nobusawa465 | 2023-08-21T01:33:04.000Z | [
"region:us"
] | Nobusawa465 | null | null | null | 0 | 0 | Entry not found |
Koungab735/Koungab735 | 2023-08-21T01:33:28.000Z | [
"region:us"
] | Koungab735 | null | null | null | 0 | 0 | Entry not found |
Midori57/Midori57 | 2023-08-21T08:32:43.000Z | [
"region:us"
] | Midori57 | null | null | null | 0 | 0 | Entry not found |
Renikawa94/Renikawa94 | 2023-08-21T08:32:40.000Z | [
"region:us"
] | Renikawa94 | null | null | null | 0 | 0 | Entry not found |
Mukai673/Mukai673 | 2023-08-21T08:39:21.000Z | [
"region:us"
] | Mukai673 | null | null | null | 0 | 0 | Entry not found |
Mizuki673/Mizuki673 | 2023-08-21T08:39:33.000Z | [
"region:us"
] | Mizuki673 | null | null | null | 0 | 0 | Entry not found |
jujnasma25/jujnasma25 | 2023-08-21T08:52:12.000Z | [
"region:us"
] | jujnasma25 | null | null | null | 0 | 0 | Entry not found |
rfashban24/rfashban24 | 2023-08-21T08:52:10.000Z | [
"region:us"
] | rfashban24 | null | null | null | 0 | 0 | Entry not found |
Teramoto853/Teramoto853 | 2023-08-21T09:00:47.000Z | [
"region:us"
] | Teramoto853 | null | null | null | 0 | 0 | Entry not found |
Shionone84/Shionone84 | 2023-08-21T09:00:44.000Z | [
"region:us"
] | Shionone84 | null | null | null | 0 | 0 | Entry not found |
Otake893/Otake893 | 2023-08-21T15:48:21.000Z | [
"region:us"
] | Otake893 | null | null | null | 0 | 0 | Entry not found |
Hifumi098/Hifumi098 | 2023-08-21T15:48:00.000Z | [
"region:us"
] | Hifumi098 | null | null | null | 0 | 0 | Entry not found |
Hamano973/Hamano973 | 2023-08-21T15:55:41.000Z | [
"region:us"
] | Hamano973 | null | null | null | 0 | 0 | Entry not found |
Rikung634/Rikung634 | 2023-08-22T00:11:34.000Z | [
"region:us"
] | Rikung634 | null | null | null | 0 | 0 | Entry not found |
Izumo842/Izumo842 | 2023-08-22T00:18:08.000Z | [
"region:us"
] | Izumo842 | null | null | null | 0 | 0 | Entry not found |
Mitsuru73/Mitsuru73 | 2023-08-22T03:28:47.000Z | [
"region:us"
] | Mitsuru73 | null | null | null | 0 | 0 | Entry not found |
Sakae56/Sakae56 | 2023-08-20T00:50:04.000Z | [
"region:us"
] | Sakae56 | null | null | null | 0 | 0 | Entry not found |
Sakae56/Sakaena56 | 2023-08-22T03:35:55.000Z | [
"region:us"
] | Sakae56 | null | null | null | 0 | 0 | Entry not found |
Muraoka843/Muraoka843 | 2023-08-22T03:35:54.000Z | [
"region:us"
] | Muraoka843 | null | null | null | 0 | 0 | Entry not found |
Toshiharu45/Toshiharu45 | 2023-08-22T03:42:12.000Z | [
"region:us"
] | Toshiharu45 | null | null | null | 0 | 0 | Entry not found |
Yokouchi83/Yokouchi83 | 2023-08-22T00:11:20.000Z | [
"region:us"
] | Yokouchi83 | null | null | null | 0 | 0 | Entry not found |
vikp/reverse_instruct | 2023-08-21T05:25:03.000Z | [
"region:us"
] | vikp | null | null | null | 2 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: kind
dtype: string
- name: prob
dtype: float64
splits:
- name: train
num_bytes: 694061788.6849711
num_examples: 613214
download_size: 372451511
dataset_size: 694061788.6849711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reverse_instruct"
This is a reverse instruction dataset, designed to be used in cases where we're trying to predict the `instruction` given the `output`.
This is useful to train a model that can generate instructions for a raw dataset (useful to quickly instruction tune on a new domain).
This data was created by aggregating [clean alpaca data](https://github.com/gururise/AlpacaDataCleaned), [evol-instruct](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1), [clean instruct](https://huggingface.co/datasets/crumb/Clean-Instruct-3M), and [orca](https://huggingface.co/datasets/Open-Orca/OpenOrca).
The combined dataset was filtered using heuristics to remove instructions unlikely to be predictable from the output (multiple choice, continue this passage, etc.). The dataset was then filtered using [instruct_rater](https://huggingface.co/vikp/instruct_rater), a trained classifier that predicts how likely an instruction is to be able to be recreated from an output. |
Takuma67/Takuma67 | 2023-08-22T03:41:50.000Z | [
"region:us"
] | Takuma67 | null | null | null | 0 | 0 | Entry not found |
Kazuhiro35/Kazuhiro35 | 2023-08-22T04:02:09.000Z | [
"region:us"
] | Kazuhiro35 | null | null | null | 0 | 0 | Entry not found |
matiasosorio/perros_y_no_perros | 2023-08-20T01:23:45.000Z | [
"license:apache-2.0",
"region:us"
] | matiasosorio | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
fejalu/your-dataset-name | 2023-08-20T01:43:59.000Z | [
"region:us"
] | fejalu | null | null | null | 0 | 0 | Entry not found |
jemmalmeida/geraldate | 2023-08-20T02:54:48.000Z | [
"region:us"
] | jemmalmeida | null | null | null | 0 | 0 | Entry not found |
gwj/mona | 2023-08-20T02:32:26.000Z | [
"region:us"
] | gwj | null | null | null | 0 | 0 | Entry not found |
leofto/vfs | 2023-08-20T02:40:26.000Z | [
"license:bsd",
"region:us"
] | leofto | null | null | null | 0 | 0 | ---
license: bsd
---
|
ProgramComputer/test | 2023-09-06T01:53:02.000Z | [
"region:us"
] | ProgramComputer | VoxCeleb is an audio-visual dataset consisting of short clips of human speech, extracted from interview videos uploaded to YouTube | @Article{Nagrani19,
author = "Arsha Nagrani and Joon~Son Chung and Weidi Xie and Andrew Zisserman",
title = "Voxceleb: Large-scale speaker verification in the wild",
journal = "Computer Speech and Language",
year = "2019",
publisher = "Elsevier",
}
@InProceedings{Chung18b,
author = "Chung, J.~S. and Nagrani, A. and Zisserman, A.",
title = "VoxCeleb2: Deep Speaker Recognition",
booktitle = "INTERSPEECH",
year = "2018",
}
@InProceedings{Nagrani17,
author = "Nagrani, A. and Chung, J.~S. and Zisserman, A.",
title = "VoxCeleb: a large-scale speaker identification dataset",
booktitle = "INTERSPEECH",
year = "2017",
} | null | 0 | 0 | |
ixelszy/detailedBG_Lora | 2023-08-20T03:04:36.000Z | [
"license:wtfpl",
"region:us"
] | ixelszy | null | null | null | 0 | 0 | ---
license: wtfpl
---
|
krvhrv/Healix-2.8B-Token-Medical-Shot | 2023-08-20T05:26:19.000Z | [
"region:us"
] | krvhrv | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 18067633753
num_examples: 1388866
download_size: 8989892446
dataset_size: 18067633753
---
# Dataset Card for "Healix-2.8B-Token-Medical-Shot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
paulinaaguilera/trabajo_final2 | 2023-08-22T02:52:33.000Z | [
"license:apache-2.0",
"region:us"
] | paulinaaguilera | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Rikah45563/jansmba63 | 2023-08-20T03:03:52.000Z | [
"region:us"
] | Rikah45563 | null | null | null | 0 | 0 | Entry not found |
eval4nlp-oom/train | 2023-08-25T06:54:10.000Z | [
"region:us"
] | eval4nlp-oom | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: score
dtype: float64
splits:
- name: summarization
num_bytes: 789401
num_examples: 320
- name: en_de
num_bytes: 2440668
num_examples: 11046
- name: zh_en
num_bytes: 4430272
num_examples: 15750
download_size: 0
dataset_size: 7660341
---
# Dataset Card for "train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eval4nlp-oom/dev | 2023-08-25T06:54:14.000Z | [
"region:us"
] | eval4nlp-oom | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: summarization
num_bytes: 3145248
num_examples: 1280
- name: en_de
num_bytes: 1539275
num_examples: 7364
- name: zh_en
num_bytes: 2885729
num_examples: 10500
download_size: 0
dataset_size: 7570252
---
# Dataset Card for "dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
klnsbing/bibg | 2023-08-20T03:09:26.000Z | [
"region:us"
] | klnsbing | null | null | null | 0 | 0 | Entry not found |
tyzhu/fw_num_bi_train_10000_eval_100 | 2023-08-21T06:35:39.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2135417
num_examples: 30200
- name: train_doc2id
num_bytes: 817094
num_examples: 10100
- name: train_id2doc
num_bytes: 847394
num_examples: 10100
- name: train_find_word
num_bytes: 470929
num_examples: 10000
- name: eval_find_word
num_bytes: 4823
num_examples: 100
download_size: 929394
dataset_size: 4275657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
---
# Dataset Card for "fw_num_bi_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tarik2568/SentNoB | 2023-08-20T03:36:47.000Z | [
"region:us"
] | tarik2568 | null | null | null | 0 | 0 | Citation Information:
@inproceedings{islam2021sentnob,
title={SentNoB: A Dataset for Analysing Sentiment on Noisy Bangla Texts},
author={Islam, Khondoker Ittehadul and Kar, Sudipta and Islam, Md Saiful and Amin, Mohammad Ruhul},
booktitle={Findings of the Association for Computational Linguistics: EMNLP 2021},
pages={3265--3271},
year={2021}
} |
namphan1999/data-luat | 2023-08-20T03:49:28.000Z | [
"region:us"
] | namphan1999 | null | null | null | 0 | 0 | Entry not found |
Jing24/generate_sub_8 | 2023-08-20T03:53:12.000Z | [
"region:us"
] | Jing24 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 13346959
num_examples: 14670
download_size: 2474614
dataset_size: 13346959
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lexamp/datapsych | 2023-08-20T10:02:12.000Z | [
"region:us"
] | lexamp | null | null | null | 0 | 0 | Medical and therapy datasets combined. |
approximatelylinear/transformer-functions | 2023-08-20T04:21:48.000Z | [
"region:us"
] | approximatelylinear | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: signature
dtype: string
- name: body
dtype: string
- name: filename
dtype: string
splits:
- name: train
num_bytes: 17030109
num_examples: 16212
download_size: 5010779
dataset_size: 17030109
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "transformer-functions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wywinter/gy | 2023-08-20T04:57:51.000Z | [
"region:us"
] | wywinter | null | null | null | 0 | 0 | Entry not found |
atmajagyt/sartimin | 2023-08-20T04:48:48.000Z | [
"region:us"
] | atmajagyt | null | null | null | 0 | 0 | Entry not found |