| datasetId | card |
|---|---|
AlekseyKorshuk/crowdsource-v2.0 | ---
dataset_info:
features:
- name: bot_id
dtype: string
- name: conversation_id
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
- name: bot_config
struct:
- name: bot_label
dtype: string
- name: description
dtype: string
- name: developer_uid
dtype: string
- name: first_message
dtype: string
- name: image_url
dtype: string
- name: introduction
dtype: string
- name: max_history
dtype: int64
- name: memory
dtype: string
- name: model
dtype: string
- name: name
dtype: string
- name: prompt
dtype: string
- name: repetition_penalty
dtype: float64
- name: response_length
dtype: int64
- name: temperature
dtype: float64
- name: theme
dtype: 'null'
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: user_label
dtype: string
- name: conversation_history
dtype: string
- name: system
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 106588734
num_examples: 19541
download_size: 65719430
dataset_size: 106588734
---
# Dataset Card for "crowdsource-v2.0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HangenYuu/gutenberg-english-train | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
splits:
- name: train
num_bytes: 21730024796
num_examples: 48284
download_size: 0
dataset_size: 21730024796
---
# Dataset Card for "gutenberg-english-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hfaus/CelebA_bbox_and_facepoints | ---
size_categories:
- n<1K
---
# CelebA Dataset
CelebA is a large-scale face attributes dataset with more than 200K celebrity images, each with 40 attribute annotations. The images cover large pose variations and background clutter. CelebA offers large diversity, large quantity, and rich annotations: 10,177 identities, 202,599 face images, and, for each image, 5 landmark locations and 40 binary attribute annotations.
## Usage
It is composed of 3 sets of images:
* Training
* Validation
* Test
## Example
The dataset returns each item as a dictionary with the following fields:
```
{
"image": image,
"bbox": [x1, y1, w, h],
"facial_landmarks": {
"lefteye": [x1, y1],
"righteye": [x2, y2],
"nose": [x3, y3],
"leftmouth": [x4, y4],
"rightmouth": [x5, y5]
}
}
```
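For reference, a minimal loading sketch (an assumption that this repo loads through the standard `datasets` API and exposes a `train` split; the field names follow the dictionary above):
```python
from datasets import load_dataset

# Minimal sketch: the repo id is this card's id; the "train" split name
# is an assumption based on the three sets listed under Usage.
ds = load_dataset("hfaus/CelebA_bbox_and_facepoints", split="train")
item = ds[0]
print(item["bbox"])                      # [x1, y1, w, h]
print(item["facial_landmarks"]["nose"])  # [x3, y3]
```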
## License
CelebA Dataset is licensed under Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0). |
zicsx/indic-align-hindi | ---
dataset_info:
features:
- name: interactions
sequence:
sequence: string
- name: num_turns
dtype: int64
splits:
- name: train
num_bytes: 12940934287
num_examples: 13310858
download_size: 2105451389
dataset_size: 12940934287
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "indic-align-hindi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/bus_few4_50x | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 677514
num_examples: 3500
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 755032
---
# Dataset Card for "bus_few4_50x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NX2411/mydataset-only-test | ---
license: apache-2.0
---
|
OzoneAsai/wikibooks | ---
license: wtfpl
---
|
nlpso/m1_qualitative_analysis_ref_ptrn_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m1_qualitative_analysis_ref_ptrn_cmbert_iob2
## Introduction
This dataset was used to perform **qualitative analysis** of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on a **nested NER task** using the Independent NER layers approach [M1].
It contains entries from 19th-century Paris trade directories.
## Dataset parameters
* Approach: M1
* Dataset type: ground-truth
* Tokenizer: [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format: IOB2
* Counts:
  * Train: 6084
  * Dev: 676
  * Test: 1685
* Associated fine-tuned models:
  * Level 1: [nlpso/m1_ind_layers_ref_ptrn_cmbert_iob2_level_1](https://huggingface.co/nlpso/m1_ind_layers_ref_ptrn_cmbert_iob2_level_1)
  * Level 2: [nlpso/m1_ind_layers_ref_ptrn_cmbert_iob2_level_2](https://huggingface.co/nlpso/m1_ind_layers_ref_ptrn_cmbert_iob2_level_2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m1_qualitative_analysis_ref_ptrn_cmbert_iob2")
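# The card lists three splits (Train: 6084, Dev: 676, Test: 1685);
# the split name below is an assumption based on those counts.
train = train_dev_test["train"]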
```
|
griffin/chain_of_density | ---
dataset_info:
- config_name: annotated
features:
- name: article
dtype: string
- name: highlights
dtype: string
- name: id
dtype: string
- name: prediction
sequence: string
- name: missing
sequence: string
- name: model
dtype: string
- name: annotations
sequence: int64
- name: num_tokens
sequence: int64
- name: num_entities
sequence: int64
- name: fusion
sequence: float64
- name: entity_density
sequence: float64
- name: inverse_lead_bias
sequence: float64
- name: extractive_density
sequence: float64
- name: extractive_coverage
sequence: float64
- name: unique_unigrams
sequence: float64
- name: unique_bigrams
sequence: float64
- name: unique_trigrams
sequence: float64
- name: rouge1
sequence: float64
- name: rouge2
sequence: float64
- name: rougeL
sequence: float64
- name: rougeLsum
sequence: float64
- name: gpt4_informative
sequence: float64
- name: gpt4_quality
sequence: float64
- name: gpt4_attributable
sequence: float64
- name: gpt4_coherence
sequence: float64
- name: gpt4_overall
sequence: float64
splits:
- name: test
num_bytes: 750471
num_examples: 100
download_size: 452599
dataset_size: 750471
- config_name: unannotated
features:
- name: article
dtype: string
- name: highlights
dtype: string
- name: id
dtype: string
- name: prediction
sequence: string
- name: missing
sequence: string
- name: model
dtype: string
- name: num_tokens
sequence: int64
- name: num_entities
sequence: int64
- name: fusion
sequence: float64
- name: entity_density
sequence: float64
- name: inverse_lead_bias
sequence: float64
- name: extractive_density
sequence: float64
- name: extractive_coverage
sequence: float64
- name: unique_unigrams
sequence: float64
- name: unique_bigrams
sequence: float64
- name: unique_trigrams
sequence: float64
- name: rouge1
sequence: float64
- name: rouge2
sequence: float64
- name: rougeL
sequence: float64
- name: rougeLsum
sequence: float64
splits:
- name: train
num_bytes: 6948744
num_examples: 1000
download_size: 3719092
dataset_size: 6948744
configs:
- config_name: annotated
data_files:
- split: test
path: annotated/test-*
- config_name: unannotated
data_files:
- split: train
path: unannotated/train-*
---
# Dataset Card for "chain_of_density"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anilguven/turkish_tweet_emotion_dataset | ---
license: unknown
task_categories:
- text-classification
language:
- tr
tags:
- tweet
- turkish
- sentiment
- emotion
size_categories:
- 1K<n<10K
---
## Citation
**BibTeX:**
@INPROCEEDINGS{8946435,
author={Güven, Zekeriya Anıl and Diri, Banu and Çakaloğlu, Tolgahan},
booktitle={2019 Innovations in Intelligent Systems and Applications Conference (ASYU)},
title={Comparison Method for Emotion Detection of Twitter Users},
year={2019},
volume={},
number={},
pages={1-5},
keywords={Twitter;Resource management;Machine learning algorithms;Computer science;Media;Advertising;Topic Modelling;Latent Dirichlet Allocation;Natural Language Processing;Emotion Detection;Sentiment Analysis;Machine Learning},
doi={10.1109/ASYU48272.2019.8946435}}
**APA:**
Güven, Z. A., Diri, B., & Çakaloğlu, T. (2019, October). Comparison Method for Emotion Detection of Twitter Users. In 2019 Innovations in Intelligent Systems and Applications Conference (ASYU) (pp. 1-5). IEEE.
|
KayEe/flipkart_sentiment_analysis | ---
language:
- en
pretty_name: sa
configs:
- config_name: default
data_files:
- split: train
path: "train.json"
- split: test
path: "test.json"
default: true
--- |
AdapterOcean/med_alpaca_standardized_cluster_54_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12656152
num_examples: 21288
download_size: 6558110
dataset_size: 12656152
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_54_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Treza12/all_dataset | ---
license: apache-2.0
---
|
benayas/banking_llm_v4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 21973867
num_examples: 10003
- name: test
num_bytes: 6745410
num_examples: 3080
download_size: 2573335
dataset_size: 28719277
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
PNLPhub/C-ExaPPC | ---
license: mit
---
|
Isotonic/open-instruct-v1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 693502500.8465096
num_examples: 399050
- name: test
num_bytes: 173376494.1534904
num_examples: 99763
download_size: 369952246
dataset_size: 866878995.0
task_categories:
- text-generation
- conversational
language:
- en
size_categories:
- 100K<n<1M
---
# Dataset Card for "open-instruct-v1"
Open Instruct V1 is an amalgamation of several datasets, cleaned and collated into a single format for training.
It uses Stability AI's system prompt:
```
### System: StableLM Tuned (Alpha version)
- StableLM is a helpful and harmless open-source AI language model developed by StabilityAI.
- StableLM is excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- StableLM is more than just an information source, StableLM is also able to write poetry, short stories, and make jokes.
- StableLM will refuse to participate in anything that could harm a human.
```
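A minimal loading sketch (an assumption that the standard `datasets` API applies; the field names come from the feature list in this card's metadata):
```python
from datasets import load_dataset

# Minimal sketch: "text" holds the pre-collated prompt + response;
# instruction/input/output are the raw components it was built from.
ds = load_dataset("Isotonic/open-instruct-v1", split="train")
sample = ds[0]
print(sample["instruction"])
print(sample["text"][:200])
```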
## Dataset Breakdown
| Dataset | Amount of Samples |
|----------------|-------------------|
| [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) | 51759 |
| [Self Instruct](https://github.com/yizhongw/self-instruct) | 82599 |
| [GPT-4 Instruct](https://github.com/teknium1/GPTeacher) | 18194 |
| [Code Alpaca](https://huggingface.co/datasets/HuggingFaceH4/CodeAlpaca_20K) | 18019 |
| [Dolly](https://huggingface.co/datasets/HuggingFaceH4/databricks_dolly_15k) | 15015 |
| [Synthetic](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise) | 33143 |
| [Roleplay](https://github.com/teknium1/GPTeacher) | 3146 |
| [asss](https://huggingface.co/datasets/HuggingFaceH4/asss) | 448 |
| [instruction-dataset](https://huggingface.co/datasets/HuggingFaceH4/instruction-dataset) | 327 |
| [Human assistant deduped](https://huggingface.co/datasets/Isotonic/human_assistant_conversation_deduped) | 209350 |
| Total | 432000 | |
open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf | ---
pretty_name: Evaluation run of openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T13:17:24.378047](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf/blob/main/results_2023-12-29T13-17-24.378047.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5079906839832642,\n\
\ \"acc_stderr\": 0.03424315613001413,\n \"acc_norm\": 0.516456403503363,\n\
\ \"acc_norm_stderr\": 0.03513746029855274,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4416133249202012,\n\
\ \"mc2_stderr\": 0.01548425276508773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4974402730375427,\n \"acc_stderr\": 0.014611199329843784,\n\
\ \"acc_norm\": 0.5358361774744027,\n \"acc_norm_stderr\": 0.01457381366473572\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5986855208125871,\n\
\ \"acc_stderr\": 0.004891626718097025,\n \"acc_norm\": 0.7908783110934077,\n\
\ \"acc_norm_stderr\": 0.0040585031572305955\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112143,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.027869320571664632,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.027869320571664632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.032922966391551414,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.032922966391551414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5042016806722689,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.5042016806722689,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6642201834862386,\n \"acc_stderr\": 0.020248081396752923,\n \"\
acc_norm\": 0.6642201834862386,\n \"acc_norm_stderr\": 0.020248081396752923\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6470588235294118,\n \"acc_stderr\": 0.03354092437591519,\n \"\
acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03354092437591519\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990407,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990407\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.043389203057924,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.043389203057924\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.038818912133343826,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.038818912133343826\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.028120966503914407,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.028120966503914407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6896551724137931,\n\
\ \"acc_stderr\": 0.016543785026048304,\n \"acc_norm\": 0.6896551724137931,\n\
\ \"acc_norm_stderr\": 0.016543785026048304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n\
\ \"acc_stderr\": 0.016135759015030122,\n \"acc_norm\": 0.3687150837988827,\n\
\ \"acc_norm_stderr\": 0.016135759015030122\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891765,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891765\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347237,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347237\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3878748370273794,\n\
\ \"acc_stderr\": 0.012444998309675617,\n \"acc_norm\": 0.3878748370273794,\n\
\ \"acc_norm_stderr\": 0.012444998309675617\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872408,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872408\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4416133249202012,\n\
\ \"mc2_stderr\": 0.01548425276508773\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.01234691486341531\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.002504942226860527\n }\n}\n```"
repo_url: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|arc:challenge|25_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|arc:challenge|25_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|gsm8k|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|gsm8k|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hellaswag|10_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hellaswag|10_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-26-13.560554.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T13-17-24.378047.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T13-17-24.378047.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- '**/details_harness|winogrande|5_2023-12-27T14-26-13.560554.parquet'
- split: 2023_12_29T13_17_24.378047
path:
- '**/details_harness|winogrande|5_2023-12-29T13-17-24.378047.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T13-17-24.378047.parquet'
- config_name: results
data_files:
- split: 2023_12_27T14_26_13.560554
path:
- results_2023-12-27T14-26-13.560554.parquet
- split: 2023_12_29T13_17_24.378047
path:
- results_2023-12-29T13-17-24.378047.parquet
- split: latest
path:
- results_2023-12-29T13-17-24.378047.parquet
---
# Dataset Card for Evaluation run of openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-beta-13b-chat-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf",
"harness_winogrande_5",
split="train")
```
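The aggregated "results" configuration described above can be loaded the same way; this is a minimal sketch using the `latest` split defined in this card:
```python
from datasets import load_dataset

# aggregated metrics for the most recent run of this model
results = load_dataset(
    "open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores
```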
## Latest results
These are the [latest results from run 2023-12-29T13:17:24.378047](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf/blob/main/results_2023-12-29T13-17-24.378047.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5079906839832642,
"acc_stderr": 0.03424315613001413,
"acc_norm": 0.516456403503363,
"acc_norm_stderr": 0.03513746029855274,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4416133249202012,
"mc2_stderr": 0.01548425276508773
},
"harness|arc:challenge|25": {
"acc": 0.4974402730375427,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.5358361774744027,
"acc_norm_stderr": 0.01457381366473572
},
"harness|hellaswag|10": {
"acc": 0.5986855208125871,
"acc_stderr": 0.004891626718097025,
"acc_norm": 0.7908783110934077,
"acc_norm_stderr": 0.0040585031572305955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112143,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.027869320571664632,
"acc_norm": 0.6,
"acc_norm_stderr": 0.027869320571664632
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.032922966391551414,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.032922966391551414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.025329663163489943,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.025329663163489943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5042016806722689,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.5042016806722689,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6642201834862386,
"acc_stderr": 0.020248081396752923,
"acc_norm": 0.6642201834862386,
"acc_norm_stderr": 0.020248081396752923
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03354092437591519,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03354092437591519
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.043389203057924,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.043389203057924
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.028120966503914407,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.028120966503914407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.016543785026048304,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.016543785026048304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3687150837988827,
"acc_stderr": 0.016135759015030122,
"acc_norm": 0.3687150837988827,
"acc_norm_stderr": 0.016135759015030122
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891765,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891765
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347237,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347237
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3878748370273794,
"acc_stderr": 0.012444998309675617,
"acc_norm": 0.3878748370273794,
"acc_norm_stderr": 0.012444998309675617
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872408,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872408
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4416133249202012,
"mc2_stderr": 0.01548425276508773
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.01234691486341531
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860527
}
}
```
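To work with the raw numbers instead of a `datasets` view, the per-run JSON file linked above can be downloaded directly. The sketch below is illustrative, not part of the official tooling (it assumes the file keeps the layout shown above, possibly nested under a top-level `"results"` key), and computes the macro-average accuracy over the `hendrycksTest` (MMLU) tasks:
```python
import json

from huggingface_hub import hf_hub_download

# fetch the results file for the 2023-12-29 run shown above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-beta-13b-chat-hf",
    filename="results_2023-12-29T13-17-24.378047.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
results = data.get("results", data)  # some result files nest metrics under "results"

# macro-average accuracy across the hendrycksTest (MMLU) subtasks
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```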
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mboth/waermeErzeugen-100-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': BHKW
'1': Kessel
'2': Pelletkessel
'3': Waermepumpe
'4': WaermeversorgerAllgemein
splits:
- name: train
num_bytes: 64185.247706422015
num_examples: 359
- name: test
num_bytes: 38880
num_examples: 218
- name: valid
num_bytes: 38880
num_examples: 218
download_size: 61981
dataset_size: 141945.247706422
---
# Dataset Card for "waermeErzeugen-100-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PY007/slimpajama_llama_tokenized_upsample_4096_chunk_1M | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: labels
dtype: int64
- name: source
list:
- name: end
dtype: int64
- name: source
dtype: string
- name: start
dtype: int64
splits:
- name: train
num_bytes: 40394155659
num_examples: 5041
download_size: 9203916526
dataset_size: 40394155659
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Generated using https://github.com/FranxYao/Long-Context-Data-Engineering with the command below:
```bash
# create the log and output directories
mkdir -p logs data/slimpajama/per_source_downsample
cd data_engineering
# source dataset on the Hugging Face Hub
PATH_TO_SLIMPAJAMA=rokset3/slim_pajama_chunk_1
# pack the corpus in the background: per-source downsampling at ratio 0.1,
# with a chunk size of 1000001 (the "1M" chunks in the dataset name)
nohup python -u slimpajama_packing.py\
--dataset_size=100m\
--print_interval=100 --num_process=200\
--chunk_size=1000001 \
--dataset_path=$PATH_TO_SLIMPAJAMA\
--output_path=../data/slimpajama/per_source_downsample/ --down_sample_ratio=0.1 --down_sample_mode=per_source\
> ../logs/slimpajama_packing_dist_per_source_downsample_0.1.log 2>&1 &
# follow the packing log
tail -f ../logs/slimpajama_packing_dist_per_source_downsample_0.1.log
``` |
tyzhu/squad_instruction_v1_train_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 177041.73335312048
num_examples: 100
- name: validation
num_bytes: 1888548.7582781457
num_examples: 1000
download_size: 1184787
dataset_size: 2065590.4916312662
---
# Dataset Card for "squad_instruction_v1_train_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anakib1/synth-rag | ---
dataset_info:
- config_name: MWP-ru
features:
- name: audio
dtype: audio
- name: theme
dtype: string
- name: transcription
dtype: string
- name: summary
dtype: string
- name: noise
dtype: string
splits:
- name: train
num_bytes: 34334807.0
num_examples: 20
download_size: 34328238
dataset_size: 34334807.0
- config_name: concept
features:
- name: audio
dtype: audio
- name: theme
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 7900550.0
num_examples: 5
download_size: 6952224
dataset_size: 7900550.0
- config_name: dummy
features:
- name: audio
dtype: audio
- name: theme
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 39684356.0
num_examples: 20
download_size: 38522196
dataset_size: 39684356.0
- config_name: working-example
features:
- name: audio
dtype: audio
- name: theme
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 104460945.0
num_examples: 51
download_size: 91278093
dataset_size: 104460945.0
configs:
- config_name: MWP-ru
data_files:
- split: train
path: MWP-ru/train-*
- config_name: concept
data_files:
- split: train
path: concept/train-*
- config_name: dummy
data_files:
- split: train
path: dummy/train-*
- config_name: working-example
data_files:
- split: train
path: working-example/train-*
---
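This card has no usage notes yet, so the following is a minimal sketch (assuming public access to the repository) that loads one of the configurations declared above and inspects a decoded audio example:
```python
from datasets import load_dataset

# any of the configs above works: "MWP-ru", "concept", "dummy", "working-example"
ds = load_dataset("anakib1/synth-rag", "working-example", split="train")

example = ds[0]
print(example["theme"])
print(example["transcription"][:200])

audio = example["audio"]  # decoded lazily on access
print(audio["sampling_rate"], audio["array"].shape)
```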
|
Nadav/pixel_glue_mrpc_low_noise | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: validation
num_bytes: 14592360.0
num_examples: 408
download_size: 14571167
dataset_size: 14592360.0
---
# Dataset Card for "pixel_glue_mrpc_low_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/93_Hours_Korean_Children_Real_world_Casual_Conversation_and_Monologue_speech_dataset | ---
license: cc-by-nc-nd-4.0
---
## Description
Korean (South Korea) children's real-world casual conversation and monologue speech dataset. It covers self-media, conversation, live, lecture, variety show and other generic domains, mirroring real-world interactions, and is transcribed with text content, speaker ID, gender, age, accent and other attributes. The data was collected from an extensive and geographically diverse pool of speakers (children aged 12 and younger), enhancing model performance in real and complex tasks. Quality has been tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring that user privacy and legal rights are maintained throughout the data collection, storage, and usage processes; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1329?source=Huggingface
## Format
16kHz, 16 bit, wav, mono channel
## Age
12 years old and younger children
## Content category
including interview, self-media, variety show, etc.
## Recording environment
Low background noise
## Country
South Korea (KOR)
## Language(Region) Code
ko-KR
## Language
Korean
## Features of annotation
Transcription text, timestamp, speaker ID, gender, noise
## Accuracy
Word Accuracy Rate (WAR) 98%
## Licensing Information
Commercial License
|
nadsoft/Arabic-dialect-2-English | ---
dataset_info:
features:
- name: id
dtype: string
- name: Arabic
dtype: string
- name: English
dtype: string
splits:
- name: train
num_bytes: 15467913.665420653
num_examples: 16051
- name: test
num_bytes: 3867219.3345793462
num_examples: 4013
download_size: 9835505
dataset_size: 19335133.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b | ---
pretty_name: Evaluation run of uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b](https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T13:11:43.680043](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b/blob/main/results_2023-10-15T13-11-43.680043.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.057466442953020135,\n\
\ \"em_stderr\": 0.0023833905882384896,\n \"f1\": 0.17808829697986514,\n\
\ \"f1_stderr\": 0.002972308703760267,\n \"acc\": 0.44245449154575855,\n\
\ \"acc_stderr\": 0.010703432271512695\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.057466442953020135,\n \"em_stderr\": 0.0023833905882384896,\n\
\ \"f1\": 0.17808829697986514,\n \"f1_stderr\": 0.002972308703760267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \
\ \"acc_stderr\": 0.009298499235587858\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437531\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|arc:challenge|25_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T13_11_43.680043
path:
- '**/details_harness|drop|3_2023-10-15T13-11-43.680043.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T13-11-43.680043.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T13_11_43.680043
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-11-43.680043.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-11-43.680043.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hellaswag|10_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T00:07:11.850382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T00:07:11.850382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T00:07:11.850382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T13_11_43.680043
path:
- '**/details_harness|winogrande|5_2023-10-15T13-11-43.680043.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T13-11-43.680043.parquet'
- config_name: results
data_files:
- split: 2023_09_02T00_07_11.850382
path:
- results_2023-09-02T00:07:11.850382.parquet
- split: 2023_09_12T15_48_02.156025
path:
- results_2023-09-12T15-48-02.156025.parquet
- split: 2023_10_15T13_11_43.680043
path:
- results_2023-10-15T13-11-43.680043.parquet
- split: latest
path:
- results_2023-10-15T13-11-43.680043.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b](https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b",
"harness_winogrande_5",
	split="latest")
```
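
The same pattern works for any configuration declared in the YAML header above. As a minimal sketch (assuming the repository is public and using the `results` configuration defined in this card), you can pull the aggregated metrics directly:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b",
    "results",
    split="latest",
)
print(results[0])
```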
## Latest results
These are the [latest results from run 2023-10-15T13:11:43.680043](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b/blob/main/results_2023-10-15T13-11-43.680043.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.057466442953020135,
"em_stderr": 0.0023833905882384896,
"f1": 0.17808829697986514,
"f1_stderr": 0.002972308703760267,
"acc": 0.44245449154575855,
"acc_stderr": 0.010703432271512695
},
"harness|drop|3": {
"em": 0.057466442953020135,
"em_stderr": 0.0023833905882384896,
"f1": 0.17808829697986514,
"f1_stderr": 0.002972308703760267
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.009298499235587858
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437531
}
}
```
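
As a quick sanity check, the standard errors reported above can be turned into approximate 95% confidence intervals; a back-of-the-envelope normal approximation (not part of the harness output), using the winogrande numbers from this run:
```python
# Normal-approximation 95% CI from the reported winogrande accuracy above.
acc, stderr = 0.7537490134175217, 0.012108365307437531
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"winogrande acc: {acc:.3f} (95% CI: {low:.3f}-{high:.3f})")
```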
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_abacusai__Slerp-CM-mist-dpo | ---
pretty_name: Evaluation run of abacusai/Slerp-CM-mist-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/Slerp-CM-mist-dpo](https://huggingface.co/abacusai/Slerp-CM-mist-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Slerp-CM-mist-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:32:34.951153](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Slerp-CM-mist-dpo/blob/main/results_2024-01-05T00-32-34.951153.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532042094380237,\n\
\ \"acc_stderr\": 0.03203963454702555,\n \"acc_norm\": 0.6527369993047523,\n\
\ \"acc_norm_stderr\": 0.032706226155307466,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6281840008276592,\n\
\ \"mc2_stderr\": 0.01521885509426602\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537302,\n\
\ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778768\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6873132842063334,\n\
\ \"acc_stderr\": 0.004626404491616958,\n \"acc_norm\": 0.8709420434176459,\n\
\ \"acc_norm_stderr\": 0.0033457889052629563\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"\
acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6281840008276592,\n\
\ \"mc2_stderr\": 0.01521885509426602\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7278241091736164,\n \
\ \"acc_stderr\": 0.012259714035164545\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/Slerp-CM-mist-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-32-34.951153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-32-34.951153.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- '**/details_harness|winogrande|5_2024-01-05T00-32-34.951153.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-32-34.951153.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_32_34.951153
path:
- results_2024-01-05T00-32-34.951153.parquet
- split: latest
path:
- results_2024-01-05T00-32-34.951153.parquet
---
# Dataset Card for Evaluation run of abacusai/Slerp-CM-mist-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/Slerp-CM-mist-dpo](https://huggingface.co/abacusai/Slerp-CM-mist-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Slerp-CM-mist-dpo",
"harness_winogrande_5",
split="train")
```
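You can also load the aggregated metrics directly from the `results` configuration (a minimal sketch; the `latest` split points to the most recent run):
```python
from datasets import load_dataset
# Aggregated metrics; the "latest" split always points to the most recent run
results = load_dataset("open-llm-leaderboard/details_abacusai__Slerp-CM-mist-dpo",
                       "results",
                       split="latest")
```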
## Latest results
These are the [latest results from run 2024-01-05T00:32:34.951153](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Slerp-CM-mist-dpo/blob/main/results_2024-01-05T00-32-34.951153.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532042094380237,
"acc_stderr": 0.03203963454702555,
"acc_norm": 0.6527369993047523,
"acc_norm_stderr": 0.032706226155307466,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6281840008276592,
"mc2_stderr": 0.01521885509426602
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537302,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778768
},
"harness|hellaswag|10": {
"acc": 0.6873132842063334,
"acc_stderr": 0.004626404491616958,
"acc_norm": 0.8709420434176459,
"acc_norm_stderr": 0.0033457889052629563
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6281840008276592,
"mc2_stderr": 0.01521885509426602
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.7278241091736164,
"acc_stderr": 0.012259714035164545
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KyleLin/Parse-Then-Place | ---
license: ms-pl
---
This repository provides the datasets and checkpoints for the paper "A Parse-Then-Place Approach for Generating Graphic Layouts from Textual Descriptions".
Please see our [paper](https://arxiv.org/abs/2308.12700) and [code](https://github.com/microsoft/LayoutGeneration/).
|
autoevaluate/autoeval-eval-xglue-mlqa-02a2ef-48376145243 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xglue
eval_info:
task: summarization
model: google/roberta2roberta_L-24_bbc
metrics: ['bleu', 'f1', 'accuracy']
dataset_name: xglue
dataset_config: mlqa
dataset_split: test.ar
col_mapping:
text: context
target: question
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/roberta2roberta_L-24_bbc
* Dataset: xglue
* Config: mlqa
* Split: test.ar
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Anwaarma](https://huggingface.co/Anwaarma) for evaluating this model. |
emea | ---
annotations_creators:
- found
language_creators:
- found
language:
- bg
- cs
- da
- de
- el
- en
- es
- et
- fi
- fr
- hu
- it
- lt
- lv
- mt
- nl
- pl
- pt
- ro
- sk
- sl
- sv
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: EMEA
dataset_info:
- config_name: bg-el
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- bg
- el
splits:
- name: train
num_bytes: 296160562
num_examples: 1044065
download_size: 54531690
dataset_size: 296160562
- config_name: cs-et
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- cs
- et
splits:
- name: train
num_bytes: 180261167
num_examples: 1053164
download_size: 36065651
dataset_size: 180261167
- config_name: de-mt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- de
- mt
splits:
- name: train
num_bytes: 182976918
num_examples: 1000532
download_size: 36665427
dataset_size: 182976918
- config_name: fr-sk
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- fr
- sk
splits:
- name: train
num_bytes: 193605247
num_examples: 1062753
download_size: 38916074
dataset_size: 193605247
- config_name: es-lt
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- es
- lt
splits:
- name: train
num_bytes: 182623676
num_examples: 1051370
download_size: 35329033
dataset_size: 182623676
config_names:
- bg-el
- cs-et
- de-mt
- es-lt
- fr-sk
---
# Dataset Card for EMEA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://opus.nlpl.eu/EMEA.php
- **Repository:** None
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
To load a language pair that isn't part of the predefined configs, simply pass the two language codes as arguments.
You can find the valid pairs in the Homepage section of the Dataset Description: http://opus.nlpl.eu/EMEA.php
E.g.
`dataset = load_dataset("emea", lang1="en", lang2="nl")`
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
Here is an example of the `en-nl` configuration:
```
{'id': '4',
'translation': {'en': 'EPAR summary for the public',
'nl': 'EPAR-samenvatting voor het publiek'}}
```
### Data Fields
The data fields are:
- id: id of the sentence pair
- translation: a dictionary of the form {lang1: text_in_lang1, lang2: text_in_lang2}
### Data Splits
Sizes of some language pairs:
| name |train|
|----------|----:|
|bg-el|1044065|
|cs-et|1053164|
|de-mt|1000532|
|fr-sk|1062753|
|es-lt|1051370|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@InProceedings{TIEDEMANN12.463,
author = {J{\"o}rg Tiedemann},
title = {Parallel Data, Tools and Interfaces in OPUS},
booktitle = {Proceedings of the Eight International Conference on Language Resources and Evaluation (LREC'12)},
year = {2012},
month = {may},
date = {23-25},
address = {Istanbul, Turkey},
editor = {Nicoletta Calzolari (Conference Chair) and Khalid Choukri and Thierry Declerck and Mehmet Ugur Dogan and Bente Maegaard and Joseph Mariani and Jan Odijk and Stelios Piperidis},
publisher = {European Language Resources Association (ELRA)},
isbn = {978-2-9517408-7-7},
language = {english}
}
```
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
kblw/ft-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 348235189.17
num_examples: 4730
download_size: 127651795
dataset_size: 348235189.17
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
update: use different shapes
|
Andyrasika/prompt-recommendation | ---
dataset_info:
features:
- name: id
dtype: int64
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 64111
num_examples: 100
- name: eval
num_bytes: 13427
num_examples: 21
download_size: 18652
dataset_size: 77538
---
# Dataset Card for "prompt-recommendation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713063700 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2287161
num_examples: 7097
download_size: 1288578
dataset_size: 2287161
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/data-standardized_cluster_19 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 43899388
num_examples: 4277
download_size: 12613003
dataset_size: 43899388
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_en_conf_llama_worstscore | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 81192
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_conf_llama_worstscore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sayan1101/test-krra | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 204
num_examples: 1
download_size: 2504
dataset_size: 204
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test-krra"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MylesChew/JAX_FACADE_240 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 3848813.0
num_examples: 214
- name: validation
num_bytes: 371632.0
num_examples: 24
download_size: 3438896
dataset_size: 4220445.0
---
# Dataset Card for "JAX_FACADE_240"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Shahzebbb/bookcorpus_tokenized | ---
dataset_info:
features:
- name: tokens
dtype: int64
splits:
- name: train
num_bytes: 8289993464
num_examples: 1036249183
download_size: 3290839503
dataset_size: 8289993464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
niltheory/PoeticDevices | ---
license: cc-by-sa-4.0
language:
- en
tags:
- creative writing
- poetry
- poetic devices
pretty_name: Poetic Devices
size_categories:
- n<1K
--- |
katielink/moleculenet-benchmark | ---
license: apache-2.0
tags:
- biology
- chemistry
configs:
- config_name: bace
data_files:
- split: train
path: bace/train.csv
- split: test
path: bace/test.csv
- split: val
path: bace/valid.csv
- config_name: bbbp
data_files:
- split: train
path: bbbp/train.csv
- split: test
path: bbbp/test.csv
- split: val
path: bbbp/valid.csv
- config_name: clintox
data_files:
- split: train
path: clintox/train.csv
- split: test
path: clintox/test.csv
- split: val
path: clintox/valid.csv
- config_name: esol
data_files:
- split: train
path: esol/train.csv
- split: test
path: esol/test.csv
- split: val
path: esol/valid.csv
- config_name: freesolv
data_files:
- split: train
path: freesolv/train.csv
- split: test
path: freesolv/test.csv
- split: val
path: freesolv/valid.csv
- config_name: hiv
data_files:
- split: train
path: hiv/train.csv
- split: test
path: hiv/test.csv
- split: val
path: hiv/valid.csv
- config_name: lipo
data_files:
- split: train
path: lipo/train.csv
- split: test
path: lipo/test.csv
- split: val
path: lipo/valid.csv
- config_name: qm9
data_files:
- split: train
path: qm9/train.csv
- split: test
path: qm9/test.csv
- split: val
path: qm9/valid.csv
- config_name: sider
data_files:
- split: train
path: sider/train.csv
- split: test
path: sider/test.csv
- split: val
path: sider/valid.csv
- config_name: tox21
data_files:
- split: train
path: tox21/train.csv
- split: test
path: tox21/test.csv
- split: val
path: tox21/valid.csv
---
# MoleculeNet Benchmark ([website](https://moleculenet.org/))
MoleculeNet is a benchmark specially designed for testing machine learning methods on molecular properties. To facilitate the development of molecular machine learning methods, this work curates a number of dataset collections and creates a suite of software that implements many known featurizations and previously proposed algorithms. All methods and datasets are integrated as parts of the open-source DeepChem package (MIT license).
MoleculeNet is built upon multiple public databases. The full collection currently includes over 700,000 compounds tested on a range of different properties. We test the performance of various machine learning models with different featurizations on the datasets (detailed descriptions here), with all results reported as AUC-ROC, AUC-PRC, RMSE and MAE scores.
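Each task in this repository is exposed as its own configuration with `train`/`test`/`val` splits, so a single benchmark can be loaded directly (a minimal sketch using the `bace` config):
```python
from datasets import load_dataset
# Load one MoleculeNet task; every config ships train/test/val splits
bace = load_dataset("katielink/moleculenet-benchmark", "bace")
print(bace["train"][0])  # inspect the first training record
```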
For users, please cite:
Zhenqin Wu, Bharath Ramsundar, Evan N. Feinberg, Joseph Gomes, Caleb Geniesse, Aneesh S. Pappu, Karl Leswing, Vijay Pande, MoleculeNet: A Benchmark for Molecular Machine Learning, arXiv preprint, arXiv: 1703.00564, 2017.
|
ArmelR/sharded-pile | ---
configs:
- config_name: all
data_files:
- split: train
path:
- data/ArXiv/train/*.parquet
- data/BookCorpus2/train/*.parquet
- data/Books3/train/*.arrow
- data/DM Mathematics/train/*.parquet
- data/Enron Emails/train/*.parquet
- data/EuroParl/train/*.parquet
- data/FreeLaw/train/*.parquet
- data/Github/train/*.parquet
- data/Gutenberg (PG-19)/train/*.parquet
- data/HackerNews/train/*.parquet
- data/NIH ExPorter/train/*.parquet
- data/OpenSubtitles/train/*.parquet
- data/OpenWebText2/train/*.parquet
- data/PhilPapers/train/*.parquet
- data/Pile-CC/train/*.parquet
- data/PubMed Abstracts/train/*.parquet
- data/PubMed Central/train/*.parquet
- data/StackExchange/train/*.parquet
- data/UPSTO Backgrounds/train/*.parquet
- data/Ubuntu IRC/train/*.parquet
- data/Wikipedia (en)/train/*.parquet
- data/YoutubeSubtitles/train/*.parquet
  default: true
--- |
llm-lens/lens_sample_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': abyssinian
'1': american bulldog
'2': american pit bull terrier
'3': basset hound
'4': beagle
'5': bengal
'6': birman
'7': bombay
'8': boxer
'9': british shorthair
'10': chihuahua
'11': egyptian mau
'12': english cocker spaniel
'13': english setter
'14': german shorthaired
'15': great pyrenees
'16': havanese
'17': japanese chin
'18': keeshond
'19': leonberger
'20': maine coon
'21': miniature pinscher
'22': newfoundland
'23': persian
'24': pomeranian
'25': pug
'26': ragdoll
'27': russian blue
'28': saint bernard
'29': samoyed
'30': scottish terrier
'31': shiba inu
'32': siamese
'33': sphynx
'34': staffordshire bull terrier
'35': wheaten terrier
'36': yorkshire terrier
- name: id
dtype: int64
- name: tags_laion-ViT-H-14-2B
sequence: string
- name: attributes_laion-ViT-H-14-2B
sequence: string
- name: caption_Salesforce-blip-image-captioning-large
dtype: string
- name: intensive_captions_Salesforce-blip-image-captioning-large
sequence: string
splits:
- name: test
num_bytes: 183543.0
num_examples: 10
download_size: 162581
dataset_size: 183543.0
---
# Dataset Card for "lens_sample_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
msaad02/gpt-3.5-data-qa | ---
dataset_info:
features:
- name: url
dtype: string
- name: data
dtype: string
- name: api_res
dtype: string
- name: questions
dtype: string
splits:
- name: train
num_bytes: 21127749
num_examples: 2683
download_size: 8837914
dataset_size: 21127749
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tidrael/test2 | ---
annotations_creators: []
language:
- en
language_creators:
- machine-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: business-news
size_categories:
- 1K<n<10K
source_datasets:
- original
tags: []
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Top finance news headlines from bbc-news.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
Sentiment label: a threshold from -2% to 2% is used for neutral (2); values below -2% are negative (1) and values above 2% are positive (3). See the sketch below.
[More Information Needed]
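A minimal sketch of the labeling rule described above (the underlying percentage change that the thresholds apply to is not documented here, so `change_pct` is a hypothetical input):
```python
def sentiment_label(change_pct: float) -> int:
    """Map a percentage change to a sentiment label (assumed rule)."""
    # Assumption: change_pct is the undocumented value the -2%..2% thresholds apply to
    if change_pct < -2.0:
        return 1  # negative
    if change_pct > 2.0:
        return 3  # positive
    return 2  # neutral
```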
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
VishalMysore/chankyaNeet2 | ---
license: apache-2.0
---
|
orendar/ultrafeedback_binarized_filtered | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: score_diff
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 162012105.5442132
num_examples: 27043
- name: test
num_bytes: 1198181.4557868077
num_examples: 200
download_size: 90942425
dataset_size: 163210287.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "ultrafeedback_binarized_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_indefinite_for_zero | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 475331
num_examples: 1683
- name: train
num_bytes: 1012397
num_examples: 3580
- name: validation
num_bytes: 114567
num_examples: 401
download_size: 1050270
dataset_size: 1602295
---
# Dataset Card for "MULTI_VALUE_mrpc_indefinite_for_zero"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jackoon/JSON_expert_huy | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 178537
num_examples: 173
download_size: 40306
dataset_size: 178537
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "JSON_expert_huy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bibek1129/nepali_SQuAD_single_qsn | ---
license: cc-by-4.0
---
|
ethz-spylab/hh-harmless-train-with-rewards | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: chosen_reward
dtype: float32
- name: rejected_reward
dtype: float32
- name: correct
dtype: bool
- name: difference
dtype: float32
splits:
- name: train
num_bytes: 56811404
num_examples: 42537
download_size: 32032368
dataset_size: 56811404
---
This dataset is an instance of the `harmless-base` split from the [Anthropic/hh-rlhf dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf). All entries have been assigned a reward with our [custom reward model](https://huggingface.co/ethz-spylab/reward_model).
This allows us to identify the most harmful generations and use them to poison models with the oracle attack presented in our paper "[Universal Jailbreak Backdoors from Poisoned Human Feedback](https://arxiv.org/abs/2311.14455)".
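For example, the per-entry rewards make it easy to rank generations by harmfulness. A minimal sketch (we assume the convention that a lower reward marks a more harmful generation; the card does not state this explicitly):
```python
from datasets import load_dataset
ds = load_dataset("ethz-spylab/hh-harmless-train-with-rewards", split="train")
# Assumption: a lower reward means the reward model judged the text more harmful
most_harmful = ds.sort("chosen_reward").select(range(100))
print(most_harmful[0]["chosen_reward"])
```
 |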
tasksource/regset | ---
license: unknown
---
|
CyberHarem/futaba_rio_seishunbutayarou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Futaba Rio
This is the dataset of Futaba Rio, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 448 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 448 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 448 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 448 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
BangumiBase/kumakumakumabear | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Kuma Kuma Kuma Bear
This is the image base of the bangumi Kuma Kuma Kuma Bear. We detected 99 characters and 6688 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 801 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 135 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 55 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 78 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 22 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 45 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 26 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 17 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 40 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 47 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 25 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 24 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 14 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 21 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 16 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 19 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 128 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 22 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 58 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 12 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 180 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 15 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 14 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 49 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 13 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 60 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 15 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 21 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 103 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 16 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 12 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 35 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 8 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 14 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 15 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 10 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 16 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 33 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 17 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 70 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 26 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 1939 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 105 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 22 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 36 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 38 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 8 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 7 | [Download](49/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 50 | 69 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 66 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 7 | [Download](52/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 53 | 22 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 7 | [Download](54/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 55 | 14 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 197 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 52 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 8 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 29 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 62 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 26 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 69 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 30 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 11 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 55 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 15 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 204 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 283 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 26 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 40 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 17 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 8 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 13 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 18 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 16 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 8 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 10 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 51 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 135 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 62 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 14 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 48 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 15 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 14 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 38 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 6 | [Download](86/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 87 | 14 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 8 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 9 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 6 | [Download](90/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 91 | 5 | [Download](91/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 92 | 38 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 29 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 7 | [Download](94/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 95 | 17 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 24 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 11 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 223 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 7719631.487403426
num_examples: 5514
- name: validation
num_bytes: 232610
num_examples: 200
download_size: 1303658
dataset_size: 7952241.487403426
---
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OxfordFlowers_test_facebook_opt_350m_Visclues_ns_6149_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 270234974.375
num_examples: 6149
- name: fewshot_3_bs_16
num_bytes: 274952857.375
num_examples: 6149
download_size: 534121686
dataset_size: 545187831.75
---
# Dataset Card for "OxfordFlowers_test_facebook_opt_350m_Visclues_ns_6149_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
it-at-m/LHM-Dienstleistungen-QA | ---
license: mit
language:
- de
tags:
- QA
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: test
num_bytes: 560403
num_examples: 357
- name: train
num_bytes: 2826731
num_examples: 1773
download_size: 710027
dataset_size: 3387134
task_categories:
- question-answering
pretty_name: 'LHM Dienstleistungen: QA'
size_categories:
- 1K<n<10K
---
# LHM-Dienstleistungen-QA - German public domain question-answering dataset
Datasets created based on data from the Munich city administration.
Format inspired by GermanQuAD.
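A minimal loading sketch; records follow the SQuAD-style layout declared in the metadata above:
```python
from datasets import load_dataset
ds = load_dataset("it-at-m/LHM-Dienstleistungen-QA")
# SQuAD-style fields: id, title, context, question, answers{answer_start, text}
example = ds["train"][0]
print(example["question"])
print(example["answers"]["text"])
```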
## Annotated by:
- Institute for Applied Artificial Intelligence: Leon Marius Schröder
- BettercallPaul GmbH: Clemens Gutknecht, Oubada Alkiddeh, Susanne Weiß
- Stadt München: Leon Lukas
## Data basis
Texts taken from the “Dienstleistungsfinder“ of the City of Munich administration.
There, information about services offered by the city is presented online.
The information ranges from applying for an ID card to disposing of garbage.
- https://stadt.muenchen.de/service/ (Date 11/2022)
## Dataset statistics
- Shortest Question: 13 Characters
- Average Question: 68 Characters
- Longest Question: 183 Characters
### Distribution of first sentence beginnings

### Distribution of first sentence beginnings: Wie

### Distribution of first sentence beginnings: Wo

### Distribution of first sentence beginnings: Was

## Models trained using this dataset
### QA
- cgutknecht/gelectra_large_gsqd-gq-LHM
### DPR
- schreon/xnext-lhm_queries_encoder
- schreon/xnext-lhm_passages_encoder |
darcksky/Ringsofsaturnlugalkien | ---
license: artistic-2.0
---
|
open-llm-leaderboard/details_bit-dny__MindLLM | ---
pretty_name: Evaluation run of bit-dny/MindLLM
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bit-dny/MindLLM](https://huggingface.co/bit-dny/MindLLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bit-dny__MindLLM\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-27T12:33:52.223530](https://huggingface.co/datasets/open-llm-leaderboard/details_bit-dny__MindLLM/blob/main/results_2023-12-27T12-33-52.223530.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2547315459012399,\n\
\ \"acc_stderr\": 0.030757121924893716,\n \"acc_norm\": 0.2559855532831359,\n\
\ \"acc_norm_stderr\": 0.03153175700940631,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.43479871223663846,\n\
\ \"mc2_stderr\": 0.015180815930542027\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19539249146757678,\n \"acc_stderr\": 0.011586907189952911,\n\
\ \"acc_norm\": 0.22440273037542663,\n \"acc_norm_stderr\": 0.012191404938603838\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.30392352121091415,\n\
\ \"acc_stderr\": 0.004590100050198833,\n \"acc_norm\": 0.34106751643098987,\n\
\ \"acc_norm_stderr\": 0.004730991357194287\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.0327900040631005,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.0327900040631005\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029471,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029471\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.02951319662553935,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.02951319662553935\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287394,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.23870967741935484,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2828282828282828,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.034474782864143586,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.034474782864143586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2846153846153846,\n \"acc_stderr\": 0.0228783227997063,\n \
\ \"acc_norm\": 0.2846153846153846,\n \"acc_norm_stderr\": 0.0228783227997063\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838057,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838057\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3100917431192661,\n\
\ \"acc_stderr\": 0.019830849684439742,\n \"acc_norm\": 0.3100917431192661,\n\
\ \"acc_norm_stderr\": 0.019830849684439742\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n\
\ \"acc_stderr\": 0.025234593447136165,\n \"acc_norm\": 0.17040358744394618,\n\
\ \"acc_norm_stderr\": 0.025234593447136165\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19834710743801653,\n \"acc_stderr\": 0.03640118271990945,\n \"\
acc_norm\": 0.19834710743801653,\n \"acc_norm_stderr\": 0.03640118271990945\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.028286324075564407,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.028286324075564407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.015745497169049053,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.015745497169049053\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542611,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542611\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02313237623454334,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02313237623454334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n\
\ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.01812022425148458,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.01812022425148458\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015025,\n \"mc2\": 0.43479871223663846,\n\
\ \"mc2_stderr\": 0.015180815930542027\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.49329123914759276,\n \"acc_stderr\": 0.014051220692330346\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.002504942226860537\n }\n}\n```"
repo_url: https://huggingface.co/bit-dny/MindLLM
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|arc:challenge|25_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|gsm8k|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hellaswag|10_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-33-52.223530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T12-33-52.223530.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- '**/details_harness|winogrande|5_2023-12-27T12-33-52.223530.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-27T12-33-52.223530.parquet'
- config_name: results
data_files:
- split: 2023_12_27T12_33_52.223530
path:
- results_2023-12-27T12-33-52.223530.parquet
- split: latest
path:
- results_2023-12-27T12-33-52.223530.parquet
---
# Dataset Card for Evaluation run of bit-dny/MindLLM
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bit-dny/MindLLM](https://huggingface.co/bit-dny/MindLLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bit-dny__MindLLM",
"harness_winogrande_5",
split="train")
```
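The aggregated results can be loaded the same way through the "results" configuration defined in the metadata above, with the "latest" split pointing at the most recent run:
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_bit-dny__MindLLM",
    "results",
    split="latest")
```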
## Latest results
These are the [latest results from run 2023-12-27T12:33:52.223530](https://huggingface.co/datasets/open-llm-leaderboard/details_bit-dny__MindLLM/blob/main/results_2023-12-27T12-33-52.223530.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2547315459012399,
"acc_stderr": 0.030757121924893716,
"acc_norm": 0.2559855532831359,
"acc_norm_stderr": 0.03153175700940631,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.43479871223663846,
"mc2_stderr": 0.015180815930542027
},
"harness|arc:challenge|25": {
"acc": 0.19539249146757678,
"acc_stderr": 0.011586907189952911,
"acc_norm": 0.22440273037542663,
"acc_norm_stderr": 0.012191404938603838
},
"harness|hellaswag|10": {
"acc": 0.30392352121091415,
"acc_stderr": 0.004590100050198833,
"acc_norm": 0.34106751643098987,
"acc_norm_stderr": 0.004730991357194287
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029471,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029471
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287394,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.034474782864143586,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.034474782864143586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2846153846153846,
"acc_stderr": 0.0228783227997063,
"acc_norm": 0.2846153846153846,
"acc_norm_stderr": 0.0228783227997063
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.02788682807838057,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.02788682807838057
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3100917431192661,
"acc_stderr": 0.019830849684439742,
"acc_norm": 0.3100917431192661,
"acc_norm_stderr": 0.019830849684439742
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17040358744394618,
"acc_stderr": 0.025234593447136165,
"acc_norm": 0.17040358744394618,
"acc_norm_stderr": 0.025234593447136165
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19834710743801653,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.19834710743801653,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564407,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049053,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049053
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542611,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542611
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02313237623454334,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02313237623454334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.01812022425148458,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.01812022425148458
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015025,
"mc2": 0.43479871223663846,
"mc2_stderr": 0.015180815930542027
},
"harness|winogrande|5": {
"acc": 0.49329123914759276,
"acc_stderr": 0.014051220692330346
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860537
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lei1211/simple_show | ---
license: apache-2.0
---
|
roa7n/patched_test_p_40_f_membrane_m1_predictions | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: m1_preds
dtype: float32
splits:
- name: train
num_bytes: 1959469283
num_examples: 3134581
download_size: 165870843
dataset_size: 1959469283
---
# Dataset Card for "patched_test_p_40_f_membrane_m1_predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python-code-instructions-18k-alpaca-standardized_cluster_9 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 27114194
num_examples: 3146
download_size: 6998532
dataset_size: 27114194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python-code-instructions-18k-alpaca-standardized_cluster_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mangoo111/stt_datasets_mixed | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6146981696
num_examples: 6400
- name: test
num_bytes: 768372824
num_examples: 800
- name: valid
num_bytes: 768373560
num_examples: 800
download_size: 869391895
dataset_size: 7683728080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
Adeeb-qu/New-grooul | ---
license: openrail
---
|
open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct | ---
pretty_name: Evaluation run of mediocredev/open-llama-3b-v2-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mediocredev/open-llama-3b-v2-instruct](https://huggingface.co/mediocredev/open-llama-3b-v2-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T15:28:20.399841](https://huggingface.co/datasets/open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct/blob/main/results_2023-12-16T15-28-20.399841.json) (note\
\ that there might be results for other tasks in the repository if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3958981300034306,\n\
\ \"acc_stderr\": 0.034198998112262805,\n \"acc_norm\": 0.4018856544108267,\n\
\ \"acc_norm_stderr\": 0.035129135992579406,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023498,\n \"mc2\": 0.3795634078796446,\n\
\ \"mc2_stderr\": 0.014273839655133331\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35409556313993173,\n \"acc_stderr\": 0.01397545412275655,\n\
\ \"acc_norm\": 0.3848122866894198,\n \"acc_norm_stderr\": 0.014218371065251104\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5142401911969727,\n\
\ \"acc_stderr\": 0.0049877573147698445,\n \"acc_norm\": 0.7024497112129058,\n\
\ \"acc_norm_stderr\": 0.004562462665505218\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.03053333843046751,\n\
\ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.03053333843046751\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.0314895582974553,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.0314895582974553\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3931034482758621,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.3931034482758621,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4161290322580645,\n \"acc_stderr\": 0.028040981380761547,\n \"\
acc_norm\": 0.4161290322580645,\n \"acc_norm_stderr\": 0.028040981380761547\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4484848484848485,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.4484848484848485,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.48186528497409326,\n \"acc_stderr\": 0.03606065001832919,\n\
\ \"acc_norm\": 0.48186528497409326,\n \"acc_norm_stderr\": 0.03606065001832919\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35128205128205126,\n \"acc_stderr\": 0.024203665177902796,\n\
\ \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4972477064220184,\n \"acc_stderr\": 0.02143699835976532,\n \"\
acc_norm\": 0.4972477064220184,\n \"acc_norm_stderr\": 0.02143699835976532\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402544,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402544\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.46078431372549017,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5341880341880342,\n\
\ \"acc_stderr\": 0.03267942734081228,\n \"acc_norm\": 0.5341880341880342,\n\
\ \"acc_norm_stderr\": 0.03267942734081228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5261813537675607,\n\
\ \"acc_stderr\": 0.01785543455404199,\n \"acc_norm\": 0.5261813537675607,\n\
\ \"acc_norm_stderr\": 0.01785543455404199\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.02663653974111608,\n\
\ \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.02663653974111608\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553984,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553984\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40192926045016075,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.40192926045016075,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.41358024691358025,\n \"acc_stderr\": 0.027402042040269952,\n\
\ \"acc_norm\": 0.41358024691358025,\n \"acc_norm_stderr\": 0.027402042040269952\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30964797913950454,\n\
\ \"acc_stderr\": 0.01180859826250332,\n \"acc_norm\": 0.30964797913950454,\n\
\ \"acc_norm_stderr\": 0.01180859826250332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.029465133639776132,\n\
\ \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.029465133639776132\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36437908496732024,\n \"acc_stderr\": 0.019469518221573702,\n \
\ \"acc_norm\": 0.36437908496732024,\n \"acc_norm_stderr\": 0.019469518221573702\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.030021056238440286,\n\
\ \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.030021056238440286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n\
\ \"acc_stderr\": 0.035197027175769155,\n \"acc_norm\": 0.4527363184079602,\n\
\ \"acc_norm_stderr\": 0.035197027175769155\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49707602339181284,\n \"acc_stderr\": 0.03834759370936839,\n\
\ \"acc_norm\": 0.49707602339181284,\n \"acc_norm_stderr\": 0.03834759370936839\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023498,\n \"mc2\": 0.3795634078796446,\n\
\ \"mc2_stderr\": 0.014273839655133331\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6574585635359116,\n \"acc_stderr\": 0.013337483579075923\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/mediocredev/open-llama-3b-v2-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|arc:challenge|25_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|gsm8k|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hellaswag|10_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-28-20.399841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T15-28-20.399841.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- '**/details_harness|winogrande|5_2023-12-16T15-28-20.399841.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T15-28-20.399841.parquet'
- config_name: results
data_files:
- split: 2023_12_16T15_28_20.399841
path:
- results_2023-12-16T15-28-20.399841.parquet
- split: latest
path:
- results_2023-12-16T15-28-20.399841.parquet
---
# Dataset Card for Evaluation run of mediocredev/open-llama-3b-v2-instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mediocredev/open-llama-3b-v2-instruct](https://huggingface.co/mediocredev/open-llama-3b-v2-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct",
"harness_winogrande_5",
split="train")
```
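The aggregated metrics live in the "results" configuration, and every configuration also exposes a "latest" split (both declared in the YAML header above). A minimal sketch of loading them, under those assumptions:

```python
from datasets import load_dataset

# Aggregated metrics for the run ("results" config, most recent snapshot).
results = load_dataset(
    "open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct",
    "results",
    split="latest",
)

# Per-task details can be pinned to the most recent run the same way.
winogrande = load_dataset(
    "open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct",
    "harness_winogrande_5",
    split="latest",
)
```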
## Latest results
These are the [latest results from run 2023-12-16T15:28:20.399841](https://huggingface.co/datasets/open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct/blob/main/results_2023-12-16T15-28-20.399841.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3958981300034306,
"acc_stderr": 0.034198998112262805,
"acc_norm": 0.4018856544108267,
"acc_norm_stderr": 0.035129135992579406,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023498,
"mc2": 0.3795634078796446,
"mc2_stderr": 0.014273839655133331
},
"harness|arc:challenge|25": {
"acc": 0.35409556313993173,
"acc_stderr": 0.01397545412275655,
"acc_norm": 0.3848122866894198,
"acc_norm_stderr": 0.014218371065251104
},
"harness|hellaswag|10": {
"acc": 0.5142401911969727,
"acc_stderr": 0.0049877573147698445,
"acc_norm": 0.7024497112129058,
"acc_norm_stderr": 0.004562462665505218
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.03053333843046751,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.03053333843046751
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3931034482758621,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.3931034482758621,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4161290322580645,
"acc_stderr": 0.028040981380761547,
"acc_norm": 0.4161290322580645,
"acc_norm_stderr": 0.028040981380761547
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4484848484848485,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.4484848484848485,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.48186528497409326,
"acc_stderr": 0.03606065001832919,
"acc_norm": 0.48186528497409326,
"acc_norm_stderr": 0.03606065001832919
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4972477064220184,
"acc_stderr": 0.02143699835976532,
"acc_norm": 0.4972477064220184,
"acc_norm_stderr": 0.02143699835976532
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.47107438016528924,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.47107438016528924,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5341880341880342,
"acc_stderr": 0.03267942734081228,
"acc_norm": 0.5341880341880342,
"acc_norm_stderr": 0.03267942734081228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5261813537675607,
"acc_stderr": 0.01785543455404199,
"acc_norm": 0.5261813537675607,
"acc_norm_stderr": 0.01785543455404199
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.02663653974111608,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.02663653974111608
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553984,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553984
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40192926045016075,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.40192926045016075,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.41358024691358025,
"acc_stderr": 0.027402042040269952,
"acc_norm": 0.41358024691358025,
"acc_norm_stderr": 0.027402042040269952
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30964797913950454,
"acc_stderr": 0.01180859826250332,
"acc_norm": 0.30964797913950454,
"acc_norm_stderr": 0.01180859826250332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.029465133639776132,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.029465133639776132
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36437908496732024,
"acc_stderr": 0.019469518221573702,
"acc_norm": 0.36437908496732024,
"acc_norm_stderr": 0.019469518221573702
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.32653061224489793,
"acc_stderr": 0.030021056238440286,
"acc_norm": 0.32653061224489793,
"acc_norm_stderr": 0.030021056238440286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.035197027175769155,
"acc_norm": 0.4527363184079602,
"acc_norm_stderr": 0.035197027175769155
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49707602339181284,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.49707602339181284,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023498,
"mc2": 0.3795634078796446,
"mc2_stderr": 0.014273839655133331
},
"harness|winogrande|5": {
"acc": 0.6574585635359116,
"acc_stderr": 0.013337483579075923
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ohsuz/DACON_RAG | ---
dataset_info:
features:
- name: id
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7091959
num_examples: 2576
download_size: 2570364
dataset_size: 7091959
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
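A minimal loading sketch based on the schema declared above (an illustrative assumption, using the default config and the single `train` split from the YAML):

```python
from datasets import load_dataset

# Each row carries an id plus an input/output text pair, per the YAML features
ds = load_dataset("ohsuz/DACON_RAG", split="train")
print(ds[0]["id"], ds[0]["input"][:80], ds[0]["output"][:80])
```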
|
lewtun/top_quark_tagging_old | ---
license: cc-by-4.0
---
|
MeetX/mental-health-dataset-mistral7b-auto-tune | ---
dataset_info:
features:
- name: text
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 375246
num_examples: 172
download_size: 209156
dataset_size: 375246
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
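A minimal sketch for inspecting the declared schema (assuming the default config; the row count below comes from the YAML split metadata):

```python
from datasets import load_dataset

ds = load_dataset("MeetX/mental-health-dataset-mistral7b-auto-tune", split="train")
print(ds.features)  # text, instruction, and output string columns
print(ds.num_rows)  # 172 examples, per the YAML split metadata
```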
|
hotfinda/smithsonian_butterflies_subset | ---
dataset_info:
features:
- name: image_url
dtype: string
- name: image_alt
dtype: string
- name: id
dtype: string
- name: name
dtype: string
- name: scientific_name
dtype: string
- name: gender
dtype: string
- name: taxonomy
dtype: string
- name: region
dtype: string
- name: locality
dtype: string
- name: date
dtype: string
- name: usnm_no
dtype: string
- name: guid
dtype: string
- name: edan_url
dtype: string
- name: source
dtype: string
- name: stage
dtype: float64
- name: image
dtype: image
- name: image_hash
dtype: string
- name: sim_score
dtype: float64
splits:
- name: train
num_bytes: 237753960.0
num_examples: 1000
download_size: 237446351
dataset_size: 237753960.0
---
# Dataset Card for "smithsonian_butterflies_subset"
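A minimal access sketch (assuming the default config): the `image` column is declared as an image feature in the YAML, so it decodes to a PIL image on access.

```python
from datasets import load_dataset

ds = load_dataset("hotfinda/smithsonian_butterflies_subset", split="train")
sample = ds[0]
print(sample["name"], sample["scientific_name"], sample["region"])
print(sample["image"].size)  # decoded PIL.Image, per the image feature above
```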
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yunconglong__MoE_13B_DPO | ---
pretty_name: Evaluation run of yunconglong/MoE_13B_DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yunconglong/MoE_13B_DPO](https://huggingface.co/yunconglong/MoE_13B_DPO) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__MoE_13B_DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T08:55:06.256687](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__MoE_13B_DPO/blob/main/results_2024-01-28T08-55-06.256687.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6516933230862831,\n\
\ \"acc_stderr\": 0.03214433470973161,\n \"acc_norm\": 0.6507001593260299,\n\
\ \"acc_norm_stderr\": 0.03283113819359505,\n \"mc1\": 0.6364749082007344,\n\
\ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7846972943990677,\n\
\ \"mc2_stderr\": 0.013799810152287217\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7201365187713311,\n \"acc_stderr\": 0.013119040897725923,\n\
\ \"acc_norm\": 0.7431740614334471,\n \"acc_norm_stderr\": 0.0127669237941168\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7226648078072098,\n\
\ \"acc_stderr\": 0.0044676841327724115,\n \"acc_norm\": 0.8939454291973711,\n\
\ \"acc_norm_stderr\": 0.0030727817579111268\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6364749082007344,\n\
\ \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.7846972943990677,\n\
\ \"mc2_stderr\": 0.013799810152287217\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8800315706393055,\n \"acc_stderr\": 0.009131996995678647\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6762699014404853,\n \
\ \"acc_stderr\": 0.012888247397371141\n }\n}\n```"
repo_url: https://huggingface.co/yunconglong/MoE_13B_DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|arc:challenge|25_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|gsm8k|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hellaswag|10_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T08-55-06.256687.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T08-55-06.256687.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- '**/details_harness|winogrande|5_2024-01-28T08-55-06.256687.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T08-55-06.256687.parquet'
- config_name: results
data_files:
- split: 2024_01_28T08_55_06.256687
path:
- results_2024-01-28T08-55-06.256687.parquet
- split: latest
path:
- results_2024-01-28T08-55-06.256687.parquet
---
# Dataset Card for Evaluation run of yunconglong/MoE_13B_DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/MoE_13B_DPO](https://huggingface.co/yunconglong/MoE_13B_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__MoE_13B_DPO",
"harness_winogrande_5",
split="train")
```
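The aggregated metrics live in the "results" configuration declared in the metadata above; it can be loaded the same way, with the "latest" split pointing at the most recent run:
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_yunconglong__MoE_13B_DPO",
    "results",
    split="latest",
)
```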
## Latest results
These are the [latest results from run 2024-01-28T08:55:06.256687](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__MoE_13B_DPO/blob/main/results_2024-01-28T08-55-06.256687.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6516933230862831,
"acc_stderr": 0.03214433470973161,
"acc_norm": 0.6507001593260299,
"acc_norm_stderr": 0.03283113819359505,
"mc1": 0.6364749082007344,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.7846972943990677,
"mc2_stderr": 0.013799810152287217
},
"harness|arc:challenge|25": {
"acc": 0.7201365187713311,
"acc_stderr": 0.013119040897725923,
"acc_norm": 0.7431740614334471,
"acc_norm_stderr": 0.0127669237941168
},
"harness|hellaswag|10": {
"acc": 0.7226648078072098,
"acc_stderr": 0.0044676841327724115,
"acc_norm": 0.8939454291973711,
"acc_norm_stderr": 0.0030727817579111268
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.0189754279205072,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.0189754279205072
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6364749082007344,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.7846972943990677,
"mc2_stderr": 0.013799810152287217
},
"harness|winogrande|5": {
"acc": 0.8800315706393055,
"acc_stderr": 0.009131996995678647
},
"harness|gsm8k|5": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371141
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ChoiDongHo/HuggingfaceTest | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nthngdy/babylm_10M | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 55441912.303940535
num_examples: 1015494
download_size: 36288832
dataset_size: 55441912.303940535
---
# Dataset Card for "babylm_10M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
graphs-datasets/IMDB-BINARY | ---
license: unknown
task_categories:
- graph-ml
---
# Dataset Card for IMDB-BINARY (IMDb-B)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://dl.acm.org/doi/10.1145/2783258.2783417)**
- **[Repository](https://www.chrsmrrs.com/graphkerneldatasets/IMDB-BINARY.zip)**
- **Paper:** Deep Graph Kernels (see citation)
- **Leaderboard:** [Papers with code leaderboard](https://paperswithcode.com/sota/graph-classification-on-imdb-b)
### Dataset Summary
The `IMDb-B` dataset is "a movie collaboration dataset that consists of the ego-networks of 1,000 actors/actresses who played roles in movies in IMDB. In each graph, nodes represent actors/actress, and there is an edge between them if they appear in the same movie. These graphs are derived from the Action and Romance genres".
### Supported Tasks and Leaderboards
`IMDb-B` should be used for graph classification (aiming to predict whether a movie graph is an action or romance movie), a binary classification task. The score used is accuracy, computed with 10-fold cross-validation.
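As an illustration of that protocol, here is a minimal sketch (not taken from any leaderboard code; `train_and_predict` is a placeholder for whatever graph classifier you evaluate):
```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def cv_accuracy(graphs, labels, train_and_predict, n_splits=10, seed=0):
    """Mean and std of accuracy over stratified folds."""
    labels = np.asarray(labels)
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    accs = []
    for train_idx, test_idx in skf.split(np.zeros(len(labels)), labels):
        # Fit on the train fold, predict labels for the test fold.
        preds = train_and_predict(
            [graphs[i] for i in train_idx], labels[train_idx],
            [graphs[i] for i in test_idx],
        )
        accs.append(float((np.asarray(preds) == labels[test_idx]).mean()))
    return float(np.mean(accs)), float(np.std(accs))
```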
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
from datasets import load_dataset
import torch
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader

dataset_hf = load_dataset("graphs-datasets/IMDB-BINARY")
# For the train set (replace by valid or test as needed).
# Fields come as plain lists, so convert them to tensors before building Data objects.
dataset_pg_list = [
    Data(
        edge_index=torch.tensor(graph["edge_index"], dtype=torch.long),
        y=torch.tensor(graph["y"]),
        num_nodes=graph["num_nodes"],
    )
    for graph in dataset_hf["train"]
]
dataset_pg = DataLoader(dataset_pg_list)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| scale | medium |
| #graphs | 1000 |
| average #nodes | 19.79 |
| average #edges | 193.25 |
### Data Fields
Each row of a given file is a graph, with:
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `y` (list: 1 x #labels): the label(s) to predict (here a single label, equal to zero or one)
- `num_nodes` (int): number of nodes of the graph
### Data Splits
This data comes from the PyGeometric version of the dataset.
This information can be recovered using
```python
from torch_geometric.datasets import TUDataset
cur_dataset = TUDataset(root="../dataset/loaded/",
name="IMDB-BINARY")
```
## Additional Information
### Licensing Information
The dataset has been released under an unknown license; please open an issue if you have this information.
### Citation Information
```
@inproceedings{10.1145/2783258.2783417,
author = {Yanardag, Pinar and Vishwanathan, S.V.N.},
title = {Deep Graph Kernels},
year = {2015},
isbn = {9781450336642},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/2783258.2783417},
doi = {10.1145/2783258.2783417},
abstract = {In this paper, we present Deep Graph Kernels, a unified framework to learn latent representations of sub-structures for graphs, inspired by latest advancements in language modeling and deep learning. Our framework leverages the dependency information between sub-structures by learning their latent representations. We demonstrate instances of our framework on three popular graph kernels, namely Graphlet kernels, Weisfeiler-Lehman subtree kernels, and Shortest-Path graph kernels. Our experiments on several benchmark datasets show that Deep Graph Kernels achieve significant improvements in classification accuracy over state-of-the-art graph kernels.},
booktitle = {Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining},
pages = {1365–1374},
numpages = {10},
keywords = {collaboration networks, bioinformatics, r-convolution kernels, graph kernels, structured data, deep learning, social networks, string kernels},
location = {Sydney, NSW, Australia},
series = {KDD '15}
}
```
### Contributions
Thanks to [@clefourrier](https://github.com/clefourrier) for adding this dataset. |
TrainingDataPro/plantations_segmentation | ---
license: cc-by-nd-4.0
task_categories:
- image-segmentation
language:
- en
tags:
- biology
- code
dataset_info:
features:
- name: image_id
dtype: int32
- name: image
dtype: image
- name: class_segmentation
dtype: image
- name: object_segmentation
dtype: image
- name: shapes
dtype: string
splits:
- name: train
num_bytes: 48297698
num_examples: 13
download_size: 48362120
dataset_size: 48297698
---
# Plantations Segmentation
The images consist of aerial photography of agricultural plantations with crops such as cabbage and zucchini. The dataset addresses agricultural tasks such as plant detection and counting, health assessment, and irrigation planning. The dataset consists of plantations' photographs with object and class segmentation of cabbage.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/agriculture-data-labeling?utm_source=huggingface&utm_medium=cpc&utm_campaign=plantations_segmentation) to discuss your requirements, learn about the price and buy the dataset.

# Dataset structure
- **Plantations_Segmentation** - contains the original plantation images (folder **img**) and a file with annotations (.xml)
- **Object_Segmentation** - includes object segmentation masks for the original images
- **Class_Segmentation** - includes class segmentation masks for the original images
# Types of segmentation
The dataset includes two types of segmentation:
- **Class Segmentation** - objects corresponding to one class are identified
- **Object Segmentation** - all objects are identified separately
# Data Format
Each image from the `img` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the polygons. For each point, the x and y coordinates are provided; a parsing sketch follows the example below.
# Example of XML file structure
.png?generation=1685973058340642&alt=media)
# Plantation segmentation can be performed in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/agriculture-data-labeling?utm_source=huggingface&utm_medium=cpc&utm_campaign=plantations_segmentation) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
dim/competition_math_selected | ---
license: mit
dataset_info:
features:
- name: problem
dtype: string
- name: level
dtype: string
- name: type
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 2332225.2
num_examples: 3000
download_size: 1217035
dataset_size: 2332225.2
---
|
bigscience-data/roots_indic-hi_wikiversity | ---
language: hi
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-hi_wikiversity
# wikiversity_filtered
- Dataset uid: `wikiversity_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0367 % of total
- 0.1050 % of en
- 0.1178 % of fr
- 0.1231 % of pt
- 0.0072 % of zh
- 0.0393 % of es
- 0.0076 % of ar
- 0.0069 % of indic-hi
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
|
CyberHarem/de_lisle_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of de_lisle/デ・リーズル/德利尔 (Girls' Frontline)
This is the dataset of de_lisle/デ・リーズル/德利尔 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are `green_eyes, long_hair, brown_hair, heterochromia, blue_eyes, mole, mole_under_eye, bangs, multicolored_hair, twintails, hair_ornament, earrings, horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 30.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_lisle_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 14.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_lisle_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 44 | 32.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_lisle_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 25.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_lisle_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 44 | 51.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_lisle_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/de_lisle_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, solo, open_mouth, black_gloves, piercing, collarbone, earbuds, simple_background, black_choker, blush, fingerless_gloves, holding, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | open_mouth | black_gloves | piercing | collarbone | earbuds | simple_background | black_choker | blush | fingerless_gloves | holding | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------|:---------------|:-----------|:-------------|:----------|:--------------------|:---------------|:--------|:--------------------|:----------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
somewheresystems/dataclysm-wikipedia | ---
license: cc-by-sa-3.0
language:
- en
pretty_name: dataclysm-wikipedia-titles
size_categories:
- 1M<n<10M
---
# somewheresystems/dataclysm-wikipedia
## USE THE NOTEBOOK TO GET STARTED!
https://github.com/somewheresystems/dataclysm
This dataset comprises 6,458,670 English-language Wikipedia articles, with an additional column of title embeddings produced with the bge-small-en-v1.5 embeddings model. The dataset was sourced here: https://huggingface.co/datasets/wikipedia/viewer/20220301.en
This dataset contains the full text of each Wikipedia article as of the date March 01, 2022. In comparison to somewheresystems/dataclysm-wikipedia-titles (68.93 GB) and the wikipedia-titles-lite dataset (49.72 GB), this entire dataset is only 16.32 GB uncompressed, which is 86.25% and 63.18% smaller, respectively.
# Embeddings Model
We used https://huggingface.co/BAAI/bge-small-en-v1.5 to embed the article `title` field. This model was chosen because it embeds each title quickly while allowing slightly more performant retrieval than `instruct-xl`.
# Why?
You can either load this entire dataset into a database and retrieve article text via similarity search between queries and titles, link titles to URLs and pull up-to-date articles, or pull the article text as of March 01, 2022 directly from the dataset (included). For efficiency, we recommend keeping only the title, title embeddings, and URL, so the index loads quickly and the remaining information can be pulled asynchronously over the web.
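A minimal sketch of that retrieval workflow (assumptions: the precomputed title embeddings are stacked into a NumPy matrix and, like the model output below, are L2-normalized; variable names are illustrative):
```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Same model that produced the title embeddings in this dataset.
model = SentenceTransformer("BAAI/bge-small-en-v1.5")

def top_k_titles(query: str, title_vecs: np.ndarray, titles: list, k: int = 5):
    """Rank titles by cosine similarity to the query (vectors assumed normalized)."""
    q = model.encode(query, normalize_embeddings=True)
    scores = title_vecs @ q  # dot product == cosine similarity on unit vectors
    best = np.argsort(scores)[::-1][:k]
    return [(titles[i], float(scores[i])) for i in best]
```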
# Citation Information
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
```
# Contributions
Thanks to @lewtun, @mariamabarham, @thomwolf, @lhoestq, @patrickvonplaten for adding the Wikipedia dataset in the first place.
## Contact
Please contact hi@dataclysm.xyz for inquiries. |
joey234/mmlu-college_biology-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6719
num_examples: 5
- name: test
num_bytes: 417471
num_examples: 144
download_size: 14610
dataset_size: 424190
---
# Dataset Card for "mmlu-college_biology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sheik21/rose | ---
license: openrail
---
|
dinaaaaaa/LIMA_instructions_generate | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 158731.98003327788
num_examples: 80
download_size: 141043
dataset_size: 158731.98003327788
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-e4791b21-302d-4702-9dba-a4a3a73498cd-118114 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification-not-evaluated
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification-not-evaluated
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
smallfish166/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 36256485
num_examples: 6502
download_size: 10383287
dataset_size: 36256485
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yuan-sf63/chenyu_label_0.5_96 | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
- name: '72'
dtype: int64
- name: '73'
dtype: int64
- name: '74'
dtype: int64
- name: '75'
dtype: int64
- name: '76'
dtype: int64
- name: '77'
dtype: int64
- name: '78'
dtype: int64
- name: '79'
dtype: int64
- name: '80'
dtype: int64
- name: '81'
dtype: int64
- name: '82'
dtype: int64
- name: '83'
dtype: int64
- name: '84'
dtype: int64
- name: '85'
dtype: int64
- name: '86'
dtype: int64
- name: '87'
dtype: int64
- name: '88'
dtype: int64
- name: '89'
dtype: int64
- name: '90'
dtype: int64
- name: '91'
dtype: int64
- name: '92'
dtype: int64
- name: '93'
dtype: int64
- name: '94'
dtype: int64
- name: '95'
dtype: int64
splits:
- name: train
num_bytes: 33328102.677381746
num_examples: 37825
- name: validation
num_bytes: 3703318.3226182545
num_examples: 4203
download_size: 0
dataset_size: 37031421.0
---
# Dataset Card for "chenyu_label_0.5_96"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vidannn/az-text2 | ---
license: apache-2.0
---
|
CodecSR/librispeech_asr_test_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: original
num_bytes: 1238771045.0
num_examples: 5559
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 1237485326.125
num_examples: 5559
- name: academicodec_hifi_24k_320d
num_bytes: 1237485326.125
num_examples: 5559
- name: audiodec_24k_300d
num_bytes: 1240338486.125
num_examples: 5559
- name: audiodec_48k_300d_uni
num_bytes: 1240338486.125
num_examples: 5559
- name: dac_16k
num_bytes: 1238775910.125
num_examples: 5559
- name: dac_24k
num_bytes: 1238775910.125
num_examples: 5559
- name: dac_44k
num_bytes: 1238775910.125
num_examples: 5559
- name: encodec_24k_12bps
num_bytes: 1238775910.125
num_examples: 5559
- name: encodec_24k_1_5bps
num_bytes: 1238775910.125
num_examples: 5559
- name: encodec_24k_24bps
num_bytes: 1238775910.125
num_examples: 5559
- name: encodec_24k_3bps
num_bytes: 1238775910.125
num_examples: 5559
- name: encodec_24k_6bps
num_bytes: 1238775910.125
num_examples: 5559
- name: facodec_16k
num_bytes: 1238339086.125
num_examples: 5559
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 1238775910.125
num_examples: 5559
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 1238775910.125
num_examples: 5559
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 1238775910.125
num_examples: 5559
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 1238775910.125
num_examples: 5559
- name: language_codec_chinese_24k_nq8_12kbps
num_bytes: 1240065806.125
num_examples: 5559
- name: language_codec_paper_24k_nq8_12kbps
num_bytes: 1240065806.125
num_examples: 5559
- name: speech_tokenizer_16k
num_bytes: 1240065806.125
num_examples: 5559
download_size: 25399705699
dataset_size: 26018266095.5
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_300d
path: data/audiodec_24k_300d-*
- split: audiodec_48k_300d_uni
path: data/audiodec_48k_300d_uni-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: facodec_16k
path: data/facodec_16k-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: language_codec_chinese_24k_nq8_12kbps
path: data/language_codec_chinese_24k_nq8_12kbps-*
- split: language_codec_paper_24k_nq8_12kbps
path: data/language_codec_paper_24k_nq8_12kbps-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-63000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1002435
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edbeeching/prj_gia_dataset_atari_2B_atari_videopinball_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the atari_videopinball environment, with samples from the policy atari_2B_atari_videopinball_1111.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
|
atmallen/quirky_bookrating_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 96081.48302415121
num_examples: 714
- name: validation
num_bytes: 65429.5725
num_examples: 486
- name: test
num_bytes: 68424.7355
num_examples: 506
download_size: 78400
dataset_size: 229935.7910241512
---
# Dataset Card for "quirky_bookrating_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cartinoe5930/few-shot-qwen-7b | ---
dataset_info:
features:
- name: response
dtype: string
- name: predictied_answer
dtype: int64
- name: actual_answer
dtype: int64
splits:
- name: train
num_bytes: 439227
num_examples: 1319
download_size: 202218
dataset_size: 439227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Chr0my/freesound.org | ---
language:
- en
tags:
- music
size_categories:
- 100K<n<1M
---
This dataset has been scraped from https://freesound.org and contains 554,849 audio clips.
License: cc-by-sa-3.0, https://creativecommons.org/licenses/by-sa/3.0/ |
dinesht/sample_dataset | ---
license: unknown
---
|
SuperSecureHuman/vendata-code | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 41265263
num_examples: 9129
download_size: 15639176
dataset_size: 41265263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Brendan/nlp244_french_snli | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: fr_premise
dtype: string
- name: fr_hypothesis
dtype: string
splits:
- name: test
num_bytes: 2298242
num_examples: 10000
- name: train
num_bytes: 122710788
num_examples: 550152
- name: validation
num_bytes: 2305275
num_examples: 10000
download_size: 40406975
dataset_size: 127314305
---
# Dataset Card for "nlp244_french_snli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BEE-spoke-data/survivorslib-law-books | ---
dataset_info:
features:
- name: section
dtype: string
- name: filename
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 73734751.97826087
num_examples: 43
- name: validation
num_bytes: 1714761.6739130435
num_examples: 1
- name: test
num_bytes: 3429523.347826087
num_examples: 2
download_size: 42120770
dataset_size: 78879037.00000001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: odc-by
task_categories:
- text-generation
- fill-mask
language:
- en
size_categories:
- n<1K
---
# law books (nougat-small)
A decent chunk of: https://www.survivorlibrary.com/index.php/8-category/173-library-law
```text
(ki) ➜ primerdata-for-LLMs python push_dataset_from_text.py /home/pszemraj/Dropbox/programming-projects/primerdata-for-LLMs/utils/output-hf-nougat-space/law -e .md -r BEE-spoke-data/survivorslib-law-books
INFO:__main__:Looking for files with extensions: ['md']
Processing md files: 100%|███████████████████████████████| 46/46 [00:00<00:00, 778.32it/s]
INFO:__main__:Found 46 text files.
INFO:__main__:Performing train-test split...
INFO:__main__:Performing validation-test split...
INFO:__main__:Train size: 43
INFO:__main__:Validation size: 1
INFO:__main__:Test size: 2
INFO:__main__:Pushing dataset
```
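To load the pushed splits afterwards (a standard `datasets` call; the split and column names come from the metadata above):
```python
from datasets import load_dataset

ds = load_dataset("BEE-spoke-data/survivorslib-law-books")
print(ds)  # train / validation / test splits with section, filename, text columns
```
|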
open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.1 | ---
pretty_name: Evaluation run of lamhieu/ghost-7b-v0.9.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lamhieu/ghost-7b-v0.9.1](https://huggingface.co/lamhieu/ghost-7b-v0.9.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T12:21:15.645809](https://huggingface.co/datasets/open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.1/blob/main/results_2024-02-22T12-21-15.645809.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5461963762782457,\n\
\ \"acc_stderr\": 0.0341839674901544,\n \"acc_norm\": 0.5516542798358821,\n\
\ \"acc_norm_stderr\": 0.034909524738250194,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43956086098918057,\n\
\ \"mc2_stderr\": 0.015308355019122989\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536595,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.01452670554853998\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5828520215096594,\n\
\ \"acc_stderr\": 0.004920800313232742,\n \"acc_norm\": 0.770264887472615,\n\
\ \"acc_norm_stderr\": 0.004198027272982672\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873502,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873502\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916647,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916647\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954904,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954904\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.0323854694875898,\n \
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.0323854694875898\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.019109299846098292,\n \"\
acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098292\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.03149328104507955,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.03149328104507955\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334383,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334383\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7177522349936143,\n\
\ \"acc_stderr\": 0.01609530296987855,\n \"acc_norm\": 0.7177522349936143,\n\
\ \"acc_norm_stderr\": 0.01609530296987855\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.02668013476167922,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.02668013476167922\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966332,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966332\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413324,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3859191655801825,\n\
\ \"acc_stderr\": 0.012433398911476148,\n \"acc_norm\": 0.3859191655801825,\n\
\ \"acc_norm_stderr\": 0.012433398911476148\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \
\ \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872478,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872478\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.43956086098918057,\n\
\ \"mc2_stderr\": 0.015308355019122989\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117192\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26914329037149354,\n \
\ \"acc_stderr\": 0.012216595457292733\n }\n}\n```"
repo_url: https://huggingface.co/lamhieu/ghost-7b-v0.9.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-21-15.645809.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-21-15.645809.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- '**/details_harness|winogrande|5_2024-02-22T12-21-15.645809.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T12-21-15.645809.parquet'
- config_name: results
data_files:
- split: 2024_02_22T12_21_15.645809
path:
- results_2024-02-22T12-21-15.645809.parquet
- split: latest
path:
- results_2024-02-22T12-21-15.645809.parquet
---
# Dataset Card for Evaluation run of lamhieu/ghost-7b-v0.9.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lamhieu/ghost-7b-v0.9.1](https://huggingface.co/lamhieu/ghost-7b-v0.9.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.1",
"harness_winogrande_5",
split="train")
```
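Each configuration declared in this card's metadata also exposes a timestamped split plus a `latest` split. As a minimal sketch (config and split names taken from the metadata above), the aggregated metrics can be pulled from the `results` configuration like so:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; its "latest" split
# always points to the most recent run (2024_02_22T12_21_15.645809 here).
results = load_dataset(
    "open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.1",
    "results",
    split="latest",
)
print(results[0])
```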
## Latest results
These are the [latest results from run 2024-02-22T12:21:15.645809](https://huggingface.co/datasets/open-llm-leaderboard/details_lamhieu__ghost-7b-v0.9.1/blob/main/results_2024-02-22T12-21-15.645809.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5461963762782457,
"acc_stderr": 0.0341839674901544,
"acc_norm": 0.5516542798358821,
"acc_norm_stderr": 0.034909524738250194,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.43956086098918057,
"mc2_stderr": 0.015308355019122989
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536595,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.01452670554853998
},
"harness|hellaswag|10": {
"acc": 0.5828520215096594,
"acc_stderr": 0.004920800313232742,
"acc_norm": 0.770264887472615,
"acc_norm_stderr": 0.004198027272982672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873502,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873502
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916647,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916647
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954904,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954904
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.0323854694875898,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.0323854694875898
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.019109299846098292,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.019109299846098292
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.03149328104507955,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.03149328104507955
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334383,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334383
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7177522349936143,
"acc_stderr": 0.01609530296987855,
"acc_norm": 0.7177522349936143,
"acc_norm_stderr": 0.01609530296987855
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.02668013476167922,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.02668013476167922
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966332,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966332
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581986,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581986
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413324,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3859191655801825,
"acc_stderr": 0.012433398911476148,
"acc_norm": 0.3859191655801825,
"acc_norm_stderr": 0.012433398911476148
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872478,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872478
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.43956086098918057,
"mc2_stderr": 0.015308355019122989
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117192
},
"harness|gsm8k|5": {
"acc": 0.26914329037149354,
"acc_stderr": 0.012216595457292733
}
}
```
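To compare tasks programmatically, one option is to flatten this JSON into a table. This is only a sketch: it assumes a local copy of the linked results file whose top level matches the flat structure printed above, and the file path is illustrative.
```python
import json

import pandas as pd

# Illustrative path to a local copy of the results file linked above.
with open("results_2024-02-22T12-21-15.645809.json") as f:
    metrics = json.load(f)

# One row per task, one column per metric (acc, acc_norm, their stderrs, ...).
df = pd.DataFrame.from_dict(metrics, orient="index")
print(df.sort_values("acc_norm", ascending=False).head())
```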
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mHossain/merge_new_para_detection_data_v2.csv | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: int64
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 53374443.81070641
num_examples: 250453
- name: test
num_bytes: 5930683.189293594
num_examples: 27829
download_size: 24871187
dataset_size: 59305127.0
---
# Dataset Card for "merge_new_para_detection_data_v2.csv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
babs/nigerian-accented-english | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
splits:
- name: train
num_bytes: 3182575083.5629635
num_examples: 3453
- name: test
num_bytes: 643515185.1730368
num_examples: 864
download_size: 3313267272
dataset_size: 3826090268.736
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
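A minimal usage sketch, assuming the repository is publicly readable and an audio decoding backend (e.g. `soundfile`) is installed:

```python
from datasets import load_dataset

# The audio feature above is declared at a 16 kHz sampling rate;
# datasets decodes each clip to a dict with the raw waveform.
ds = load_dataset("babs/nigerian-accented-english", split="train")

sample = ds[0]
print(sample["text"])                    # transcript
print(sample["audio"]["sampling_rate"])  # 16000
print(sample["audio"]["array"].shape)    # waveform as a numpy array
```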
|
open-llm-leaderboard/details_mwitiderrick__SwahiliInstruct-v0.2 | ---
pretty_name: Evaluation run of mwitiderrick/SwahiliInstruct-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mwitiderrick/SwahiliInstruct-v0.2](https://huggingface.co/mwitiderrick/SwahiliInstruct-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__SwahiliInstruct-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T17:14:24.591374](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__SwahiliInstruct-v0.2/blob/main/results_2024-01-10T17-14-24.591374.json)\
\ (note that there might be results for other tasks in the repository if successive\
\ evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5020612158374184,\n\
\ \"acc_stderr\": 0.03431570224894014,\n \"acc_norm\": 0.5085604196260874,\n\
\ \"acc_norm_stderr\": 0.03510491450294754,\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5708474256962726,\n\
\ \"mc2_stderr\": 0.015744185818785193\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.514505119453925,\n \"acc_stderr\": 0.014605241081370056,\n\
\ \"acc_norm\": 0.5520477815699659,\n \"acc_norm_stderr\": 0.014532011498211678\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5935072694682334,\n\
\ \"acc_stderr\": 0.004901747426331732,\n \"acc_norm\": 0.7822146982672774,\n\
\ \"acc_norm_stderr\": 0.004118971487050471\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.0248708152510571,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.0248708152510571\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.42258064516129035,\n\
\ \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.42258064516129035,\n\
\ \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806587,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806587\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626301,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626301\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.033088185944157494,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.033088185944157494\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41794871794871796,\n \"acc_stderr\": 0.02500732988246122,\n\
\ \"acc_norm\": 0.41794871794871796,\n \"acc_norm_stderr\": 0.02500732988246122\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.655045871559633,\n\
\ \"acc_stderr\": 0.020380605405066952,\n \"acc_norm\": 0.655045871559633,\n\
\ \"acc_norm_stderr\": 0.020380605405066952\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.03256685484460389,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.03256685484460389\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700916,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700916\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6730523627075351,\n\
\ \"acc_stderr\": 0.016774908180131474,\n \"acc_norm\": 0.6730523627075351,\n\
\ \"acc_norm_stderr\": 0.016774908180131474\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5635838150289018,\n \"acc_stderr\": 0.026700545424943677,\n\
\ \"acc_norm\": 0.5635838150289018,\n \"acc_norm_stderr\": 0.026700545424943677\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n\
\ \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.5530546623794212,\n\
\ \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3878748370273794,\n\
\ \"acc_stderr\": 0.012444998309675617,\n \"acc_norm\": 0.3878748370273794,\n\
\ \"acc_norm_stderr\": 0.012444998309675617\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877743,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087558,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4925373134328358,\n\
\ \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.4925373134328358,\n\
\ \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5708474256962726,\n\
\ \"mc2_stderr\": 0.015744185818785193\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.01244171845689301\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11448066717210008,\n \
\ \"acc_stderr\": 0.008770157532110507\n }\n}\n```"
repo_url: https://huggingface.co/mwitiderrick/SwahiliInstruct-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|arc:challenge|25_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|gsm8k|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hellaswag|10_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-14-24.591374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T17-14-24.591374.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- '**/details_harness|winogrande|5_2024-01-10T17-14-24.591374.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T17-14-24.591374.parquet'
- config_name: results
data_files:
- split: 2024_01_10T17_14_24.591374
path:
- results_2024-01-10T17-14-24.591374.parquet
- split: latest
path:
- results_2024-01-10T17-14-24.591374.parquet
---
# Dataset Card for Evaluation run of mwitiderrick/SwahiliInstruct-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mwitiderrick/SwahiliInstruct-v0.2](https://huggingface.co/mwitiderrick/SwahiliInstruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__SwahiliInstruct-v0.2",
"harness_winogrande_5",
split="train")
```
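Similarly, the aggregated scores live in the "results" configuration, where the "latest" split always points at the most recent results file. A minimal sketch:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split resolves to the
# newest results_*.parquet file listed in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_mwitiderrick__SwahiliInstruct-v0.2",
    "results",
    split="latest",
)
```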
## Latest results
These are the [latest results from run 2024-01-10T17:14:24.591374](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__SwahiliInstruct-v0.2/blob/main/results_2024-01-10T17-14-24.591374.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5020612158374184,
"acc_stderr": 0.03431570224894014,
"acc_norm": 0.5085604196260874,
"acc_norm_stderr": 0.03510491450294754,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5708474256962726,
"mc2_stderr": 0.015744185818785193
},
"harness|arc:challenge|25": {
"acc": 0.514505119453925,
"acc_stderr": 0.014605241081370056,
"acc_norm": 0.5520477815699659,
"acc_norm_stderr": 0.014532011498211678
},
"harness|hellaswag|10": {
"acc": 0.5935072694682334,
"acc_stderr": 0.004901747426331732,
"acc_norm": 0.7822146982672774,
"acc_norm_stderr": 0.004118971487050471
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456344,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456344
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.0248708152510571,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.0248708152510571
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.42258064516129035,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.42258064516129035,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806587,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806587
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626301,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626301
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41794871794871796,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.41794871794871796,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.655045871559633,
"acc_stderr": 0.020380605405066952,
"acc_norm": 0.655045871559633,
"acc_norm_stderr": 0.020380605405066952
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.03256685484460389,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.03256685484460389
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700916,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700916
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6730523627075351,
"acc_stderr": 0.016774908180131474,
"acc_norm": 0.6730523627075351,
"acc_norm_stderr": 0.016774908180131474
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5635838150289018,
"acc_stderr": 0.026700545424943677,
"acc_norm": 0.5635838150289018,
"acc_norm_stderr": 0.026700545424943677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3878748370273794,
"acc_stderr": 0.012444998309675617,
"acc_norm": 0.3878748370273794,
"acc_norm_stderr": 0.012444998309675617
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877743,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4925373134328358,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.4925373134328358,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5708474256962726,
"mc2_stderr": 0.015744185818785193
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.01244171845689301
},
"harness|gsm8k|5": {
"acc": 0.11448066717210008,
"acc_stderr": 0.008770157532110507
}
}
```
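Each task entry above shares the same metric shape, so per-task scores can be pulled out with a few lines of Python. A minimal sketch, assuming the task-to-metrics mapping shown above has been saved locally as `results.json` (a hypothetical path):
```python
import json

# Load the task -> metrics mapping shown above (hypothetical local file).
with open("results.json") as f:
    results = json.load(f)

for task, metrics in results.items():
    # Most harness tasks report "acc"; truthfulqa:mc reports "mc1"/"mc2" instead.
    score = metrics.get("acc", metrics.get("mc2"))
    stderr = metrics.get("acc_stderr", metrics.get("mc2_stderr"))
    print(f"{task}: {score:.4f} (± {stderr:.4f})")
```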
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Multimodal-Fatima/TinyImagenet_validation | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': goldfish
'1': fire salamander
'2': american bullfrog
'3': tailed frog
'4': american alligator
'5': boa constrictor
'6': trilobite
'7': scorpion
'8': southern black widow
'9': tarantula
'10': centipede
'11': koala
'12': jellyfish
'13': brain coral
'14': snail
'15': sea slug
'16': american lobster
'17': spiny lobster
'18': black stork
'19': king penguin
'20': albatross
'21': dugong
'22': yorkshire terrier
'23': golden retriever
'24': labrador retriever
'25': german shepherd dog
'26': standard poodle
'27': tabby cat
'28': persian cat
'29': egyptian mau
'30': cougar
'31': lion
'32': brown bear
'33': ladybug
'34': grasshopper
'35': stick insect
'36': cockroach
'37': praying mantis
'38': dragonfly
'39': monarch butterfly
'40': sulphur butterfly
'41': sea cucumber
'42': guinea pig
'43': pig
'44': ox
'45': bison
'46': bighorn sheep
'47': gazelle
'48': arabian camel
'49': orangutan
'50': chimpanzee
'51': baboon
'52': african bush elephant
'53': red panda
'54': abacus
'55': academic gown
'56': altar
'57': backpack
'58': baluster / handrail
'59': barbershop
'60': barn
'61': barrel
'62': basketball
'63': bathtub
'64': station wagon
'65': lighthouse
'66': beaker
'67': beer bottle
'68': bikini
'69': binoculars
'70': birdhouse
'71': bow tie
'72': brass memorial plaque
'73': bucket
'74': high speed train
'75': butcher shop
'76': candle
'77': cannon
'78': cardigan
'79': automated teller machine
'80': cd player
'81': storage chest
'82': christmas stocking
'83': cliff dwelling
'84': computer keyboard
'85': candy store
'86': convertible
'87': crane bird
'88': dam
'89': desk
'90': dining table
'91': dumbbell
'92': flagpole
'93': fly
'94': fountain
'95': freight car
'96': frying pan
'97': fur coat
'98': gas mask or respirator
'99': go kart
'100': gondola
'101': hourglass
'102': ipod
'103': rickshaw
'104': kimono
'105': lampshade
'106': lawn mower
'107': lifeboat
'108': limousine
'109': magnetic compass
'110': maypole
'111': military uniform
'112': miniskirt
'113': moving van
'114': neck brace
'115': obelisk
'116': oboe
'117': pipe organ
'118': parking meter
'119': payphone
'120': picket fence
'121': pill bottle
'122': plunger
'123': police van
'124': poncho
'125': soda bottle
'126': potter's wheel
'127': missile
'128': punching bag
'129': refrigerator
'130': remote control
'131': rocking chair
'132': rugby ball
'133': sandal
'134': school bus
'135': scoreboard
'136': sewing machine
'137': snorkel
'138': sock
'139': sombrero
'140': space heater
'141': spider web
'142': sports car
'143': through arch bridge
'144': stopwatch
'145': sunglasses
'146': suspension bridge
'147': swim trunks / shorts
'148': syringe
'149': teapot
'150': teddy bear
'151': thatched roof
'152': torch
'153': tractor
'154': triumphal arch
'155': trolleybus
'156': turnstile
'157': umbrella
'158': vestment
'159': viaduct
'160': volleyball
'161': water jug
'162': water tower
'163': wok
'164': wooden spoon
'165': comic book
'166': fishing casting reel
'167': guacamole
'168': ice cream
'169': popsicle
'170': goose
'171': drumstick
'172': plate
'173': pretzel
'174': mashed potatoes
'175': cauliflower
'176': bell pepper
'177': lemon
'178': banana
'179': pomegranate
'180': meatloaf
'181': pizza
'182': pot pie
'183': espresso
'184': bee
'185': apron
'186': pole
'187': chihuahua
'188': mountain
'189': cliff
'190': coral reef
'191': lakeshore
'192': beach
'193': acorn
'194': broom
'195': mushroom
'196': metal nail
'197': chain
'198': slug
'199': orange
- name: id
dtype: int64
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_L_14_ensemble_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
splits:
- name: validation
num_bytes: 25528461.0
num_examples: 10000
download_size: 15791743
dataset_size: 25528461.0
---
# Dataset Card for "TinyImagenet_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haseong8012/child-10k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: audio
sequence: float32
splits:
- name: train
num_bytes: 2077216016
num_examples: 10000
download_size: 1810220972
dataset_size: 2077216016
---
# Dataset Card for "korean-child-command-voice_train-0-10000_smaplingRate-16000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alpayariyak/IAM_Sentences_LLaVA | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: string
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 1053875995.077
num_examples: 5663
download_size: 1128902513
dataset_size: 1053875995.077
---
# Dataset Card for "IAM_Sentences_LLaVA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |