datasetId | card |
|---|---|
autoevaluate/autoeval-eval-futin__feed-top_en-c0540d-2175569969 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-66b
metrics: []
dataset_name: futin/feed
dataset_config: top_en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-66b
* Dataset: futin/feed
* Config: top_en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
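As a minimal sketch, the prediction files in this repository can be loaded with the `datasets` library (the default config and the `test` split are assumptions based on the `eval_info` metadata above):
```python
from datasets import load_dataset

# Hypothetical loading pattern; the "test" split name is an assumption
# taken from the dataset_split field in the eval_info metadata above.
predictions = load_dataset(
    "autoevaluate/autoeval-eval-futin__feed-top_en-c0540d-2175569969",
    split="test",
)
print(predictions)
```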
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
proserve/medical-instruct-mixer-v2 | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 463565438.0
num_examples: 482593
- name: test
num_bytes: 74684196.0
num_examples: 40159
download_size: 278795913
dataset_size: 538249634.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
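The metadata above declares a single string column, `content`, with train and test splits. A minimal loading sketch, assuming public access to the repository:
```python
from datasets import load_dataset

# Load the train/test splits declared in the metadata; "content" is the only column.
ds = load_dataset("proserve/medical-instruct-mixer-v2")
print(ds["train"].num_rows, ds["test"].num_rows)
print(ds["train"][0]["content"][:200])
```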
|
Ti-Ma/wikipedia_2018 | ---
license: cc-by-sa-3.0
---
|
lim4349/korquad | ---
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: answers
struct:
- name: text
sequence: string
- name: answer_start
sequence: int64
splits:
- name: train
num_bytes: 75266074
num_examples: 54366
- name: validation
num_bytes: 8358264
num_examples: 6041
download_size: 51472501
dataset_size: 83624338
---
# Dataset Card for "korquad"
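The features above follow the SQuAD-style layout (question, context, and answers with character offsets). A minimal loading sketch, assuming the split names declared in the metadata:
```python
from datasets import load_dataset

# "train" and "validation" splits as declared in the metadata above.
korquad = load_dataset("lim4349/korquad")
sample = korquad["train"][0]
# Each answer carries its text and its character offset into the context.
print(sample["question"], sample["answers"]["text"], sample["answers"]["answer_start"])
```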
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/vertebral_column | ---
language:
- en
tags:
- vertebral_column
- tabular_classification
- binary_classification
- UCI
pretty_name: Vertebral Column
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- vertebral
license: cc
---
# Vertebral Column
The [Vertebral Column dataset](https://archive.ics.uci.edu/ml/datasets/vertebral+column) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|-------------------------|
| abnormal | Binary classification | Is the spine abnormal? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/vertebral_column")["train"]
``` |
torchgeo/l8biome | ---
task_categories:
- image-segmentation
tags:
- climate
pretty_name: L8 Biome
size_categories:
- n<1K
license: cc0-1.0
---
A redistribution of the data from https://landsat.usgs.gov/landsat-8-cloud-cover-assessment-validation-data, with the masks modified to add georeferencing metadata.
Landsat Data Distribution Policy: https://www.usgs.gov/media/files/landsat-data-distribution-policy |
akahana/oscar-unshuffled_deduplicated_id_10k | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 18527241
num_examples: 10000
download_size: 10371685
dataset_size: 18527241
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oscar-unshuffled_deduplicated_id_10k"
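A minimal loading sketch for the single declared `train` split, with the `id` and `text` columns listed in the metadata:
```python
from datasets import load_dataset

# 10k-example Indonesian OSCAR subset; "id" and "text" are the declared columns.
oscar_id = load_dataset("akahana/oscar-unshuffled_deduplicated_id_10k", split="train")
print(oscar_id[0]["id"], oscar_id[0]["text"][:200])
```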
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaludi/data-food-category-classification | ---
task_categories:
- image-classification
---
# Dataset for project: food-category-classification
## Dataset Description
This dataset was created for the food-category-classification project.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<512x512 RGB PIL image>",
"target": 0
},
{
"image": "<512x512 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['Bread', 'Dairy product', 'Dessert', 'Egg', 'Fried food', 'Meat', 'Noodles-Pasta', 'Rice', 'Seafood', 'Soup', 'Vegetable-Fruit'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1210 |
| valid | 275 |
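A minimal sketch for loading the images and mapping targets back to class names; the split names follow the table above and should be treated as assumptions:
```python
from datasets import load_dataset

# Split names ("train"/"valid") are taken from the table above.
food = load_dataset("Kaludi/data-food-category-classification")
labels = food["train"].features["target"].names  # "Bread", "Dairy product", ...
example = food["train"][0]
print(labels[example["target"]], example["image"].size)
```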
|
EkBass/fin-eng-dataset | ---
license: gpl-3.0
task_categories:
- translation
language:
- fi
- en
tags:
- text
- translation
- finnish
- english
pretty_name: fin-eng-dataset-6k
---
# fin-eng-dataset
# Updated 29th October 2023
New version. Covers around 30K individual words and around 10K sentences, phrases, etc.
# Updated 19th September 2023
New version. Over 20K unique words and over 2K Finnish-English sentences/paragraphs.
# Updated 10th September 2023
Updated version.
Around 15K different words and a couple of thousand sentences, paragraphs, quotes, questions, and answers.
# English
The file fine-eng-dataset.json contains over 9000 individual Finnish words with their English translations. Since some of the words are names of places, people, etc., the exact number of Finnish words is unknown.
Part of the data includes a list of Finnish words along with their English translations. However, the majority of the data consists of Finnish sentences, questions, statements, etc., that have been translated into English.
The data begins with a list of the thousand most common Finnish words with their translations. Following that are sentences, including quotes from Martti Ahtisaari, Public Domain books like "Open Life," Maila Talvio's "The Destruction of Dark Cabin," as well as sentences from free novellas "Midsummer Gift for Readers" and "Erotic Novella: Towards Malaysia."
In addition, sentences, quotes from movies, basic sentences produced by artificial intelligence, personal messages, etc., have been added, totaling over a thousand entries. Random paragraphs from Finnish Wikipedia's "random article" have also been included.
The work is intended to continue indefinitely. Help is needed; please contact krisu.virtanen@gmail.com.
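For direct inspection, the JSON file can be fetched from the Hub. Its internal schema is not documented here, so the sketch below only downloads and parses the file (filename as stated above):
```python
import json

from huggingface_hub import hf_hub_download

# Filename as given in this card; the JSON structure itself is not documented.
path = hf_hub_download(
    repo_id="EkBass/fin-eng-dataset",
    filename="fine-eng-dataset.json",
    repo_type="dataset",
)
with open(path, encoding="utf-8") as f:
    data = json.load(f)
print(type(data))
```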
# In Finnish
fine-eng-dataset.json contains over 9,000 individual Finnish words with their English translations. Because some of the words are names of places, people, and so on, the exact number of Finnish words is not known.
Part of the data is a list of Finnish words together with their English translations. Most of the data, however, consists of Finnish sentences, questions, statements, etc. that have been translated into English.
The data begins with a list of the thousand most common Finnish words with their translations. After that come sentences, including quotes from Martti Ahtisaari, from public-domain books such as "Avoin Elämä" ("Open Life") and Maila Talvio's "Pimeänpirtin hävitys" ("The Destruction of Dark Cabin"), as well as sentences from the free novellas "Juhannuslahja lukijoille" ("Midsummer Gift for Readers") and "Eroottinen novelli: Kohti Malesiaa" ("Erotic Novella: Towards Malaysia").
In addition, sentences, quotes from movies, basic sentences produced by artificial intelligence, personal messages, and so on have been added, totaling over a thousand entries, and random paragraphs have been taken from the Finnish Wikipedia's "random article" feature.
The intention is to continue the work indefinitely. Help is needed; contact krisu.virtanen@gmail.com |
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5](https://huggingface.co/BFauber/lora_llama2-13b_10e5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T01:54:15.995961](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5/blob/main/results_2024-02-10T01-54-15.995961.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5557440720682312,\n\
\ \"acc_stderr\": 0.03358121479787839,\n \"acc_norm\": 0.5618325027332456,\n\
\ \"acc_norm_stderr\": 0.03430489410692684,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.37646299641377995,\n\
\ \"mc2_stderr\": 0.013743052527776188\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449703\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.616211909978092,\n\
\ \"acc_stderr\": 0.004853134271547769,\n \"acc_norm\": 0.8241386178052181,\n\
\ \"acc_norm_stderr\": 0.003799241408502968\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236397,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307702,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307702\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042774,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042774\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.02534800603153477,\n \
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.02534800603153477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501617,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501617\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n\
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.043642261558410445,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.043642261558410445\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260666,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260666\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159607,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159607\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648036,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648036\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722327,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144373,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144373\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.012618204066588392,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.012618204066588392\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5506535947712419,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.03113088039623593,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.03113088039623593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.37646299641377995,\n\
\ \"mc2_stderr\": 0.013743052527776188\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836671\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.221379833206975,\n \
\ \"acc_stderr\": 0.01143600000425351\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-54-15.995961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-54-15.995961.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- '**/details_harness|winogrande|5_2024-02-10T01-54-15.995961.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T01-54-15.995961.parquet'
- config_name: results
data_files:
- split: 2024_02_10T01_54_15.995961
path:
- results_2024-02-10T01-54-15.995961.parquet
- split: latest
path:
- results_2024-02-10T01-54-15.995961.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5](https://huggingface.co/BFauber/lora_llama2-13b_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5",
"harness_winogrande_5",
split="train")
```
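The aggregated metrics live in the `results` config declared in the YAML above; a minimal sketch for pulling the latest aggregate (config and split names as listed in the front matter):
```python
from datasets import load_dataset

# "results" config with a "latest" split, as declared in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5",
    "results",
    split="latest",
)
```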
## Latest results
These are the [latest results from run 2024-02-10T01:54:15.995961](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5/blob/main/results_2024-02-10T01-54-15.995961.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5557440720682312,
"acc_stderr": 0.03358121479787839,
"acc_norm": 0.5618325027332456,
"acc_norm_stderr": 0.03430489410692684,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.37646299641377995,
"mc2_stderr": 0.013743052527776188
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449703
},
"harness|hellaswag|10": {
"acc": 0.616211909978092,
"acc_stderr": 0.004853134271547769,
"acc_norm": 0.8241386178052181,
"acc_norm_stderr": 0.003799241408502968
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236397,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307702,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307702
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.02534800603153477,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.02534800603153477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501617,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501617
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.043642261558410445,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.043642261558410445
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260666,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159607,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159607
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648036,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648036
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722327,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.012618204066588392,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.012618204066588392
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.03113088039623593,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.03113088039623593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.37646299641377995,
"mc2_stderr": 0.013743052527776188
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836671
},
"harness|gsm8k|5": {
"acc": 0.221379833206975,
"acc_stderr": 0.01143600000425351
}
}
```
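If you want to work with these numbers programmatically, a minimal sketch is shown below; it assumes a local copy of the results file linked above (the hosted file may nest the metrics under a "results" key, which the sketch handles either way):
```python
import json

# parse a local copy of the linked results file (filename hypothetical)
with open("results_2024-02-10T01-54-15.995961.json") as f:
    data = json.load(f)
results = data.get("results", data)  # unwrap if nested under "results"

# mean accuracy over the "hendrycksTest" (MMLU) tasks
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"mean MMLU acc over {len(mmlu)} tasks: {sum(mmlu) / len(mmlu):.4f}")
```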
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/sona_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sona (League of Legends)
This is the dataset of sona (League of Legends), containing 500 images and their tags.
The core tags of this character are `long_hair, twintails, breasts, large_breasts, blue_hair, blue_eyes, very_long_hair, aqua_hair, hair_ornament, multicolored_hair, gradient_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 748.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sona_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 425.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sona_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1150 | 860.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sona_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 658.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sona_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1150 | 1.20 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sona_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sona_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
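The IMG+TXT packages from the table above can be used without waifuc. A minimal sketch, assuming each image in the archive is paired with a same-named `.txt` tag file in a flat layout (the usual IMG+TXT convention); the choice of the 800px package is arbitrary:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract an IMG+TXT package (any package from the table works)
zip_file = hf_hub_download(
    repo_id='CyberHarem/sona_leagueoflegends',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```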
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, solo, instrument, dress, lips |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, collarbone, solo, upper_body, bangs, blush, looking_at_viewer, simple_background, white_background, closed_mouth, low_neckline, smile, blue_dress |
| 2 | 6 |  |  |  |  |  | 1girl, cleavage, necklace, solo, star_(symbol), midriff, navel, earrings, fingerless_gloves, looking_at_viewer, bra, green_gloves, purple_hair, smile |
| 3 | 6 |  |  |  |  |  | 1girl, nipples, nude, pussy, solo, aqua_eyes, blush, looking_at_viewer, navel, on_back, uncensored, bed_sheet, green_eyes, smile |
| 4 | 10 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, solo_focus, penis, blush, cum, nude, collarbone, huge_breasts, paizuri, bare_shoulders, blonde_hair, male_pubic_hair, parted_lips, smile, uncensored |
| 5 | 5 |  |  |  |  |  | 1girl, black_panties, looking_at_viewer, solo, black_bra, black_thighhighs, blonde_hair, garter_belt, garter_straps, blush, cleavage, collarbone, huge_breasts, skindentation, ass, bare_shoulders, curvy, hair_between_eyes, lingerie, looking_back, navel, open_mouth, parted_lips, simple_background, thick_thighs, thigh_gap, underwear_only |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cleavage | solo | instrument | dress | lips | collarbone | upper_body | bangs | blush | looking_at_viewer | simple_background | white_background | closed_mouth | low_neckline | smile | blue_dress | necklace | star_(symbol) | midriff | navel | earrings | fingerless_gloves | bra | green_gloves | purple_hair | nipples | nude | pussy | aqua_eyes | on_back | uncensored | bed_sheet | green_eyes | 1boy | hetero | solo_focus | penis | cum | huge_breasts | paizuri | blonde_hair | male_pubic_hair | parted_lips | black_panties | black_bra | black_thighhighs | garter_belt | garter_straps | skindentation | ass | curvy | hair_between_eyes | lingerie | looking_back | open_mouth | thick_thighs | thigh_gap | underwear_only |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:-------|:-------------|:--------|:-------|:-------------|:-------------|:--------|:--------|:--------------------|:--------------------|:-------------------|:---------------|:---------------|:--------|:-------------|:-----------|:----------------|:----------|:--------|:-----------|:--------------------|:------|:---------------|:--------------|:----------|:-------|:--------|:------------|:----------|:-------------|:------------|:-------------|:-------|:---------|:-------------|:--------|:------|:---------------|:----------|:--------------|:------------------|:--------------|:----------------|:------------|:-------------------|:--------------|:----------------|:----------------|:------|:--------|:--------------------|:-----------|:---------------|:-------------|:---------------|:------------|:-----------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | | | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | X | | | | | | | X | X | | | | | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | | | | | X | | | X | | | | | | X | | | | | | | | | | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | | | | X | | | X | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf | ---
pretty_name: Evaluation run of ehartford/CodeLlama-34b-Python-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/CodeLlama-34b-Python-hf](https://huggingface.co/ehartford/CodeLlama-34b-Python-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T22:02:41.600326](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf/blob/main/results_2023-09-17T22-02-41.600326.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931190325,\n \"f1\": 0.0019200922818791944,\n\
\ \"f1_stderr\": 0.0004138356823487018,\n \"acc\": 0.3307024467245462,\n\
\ \"acc_stderr\": 0.006650084932921209\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190325,\n\
\ \"f1\": 0.0019200922818791944,\n \"f1_stderr\": 0.0004138356823487018\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6614048934490924,\n\
\ \"acc_stderr\": 0.013300169865842417\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/CodeLlama-34b-Python-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T22_02_41.600326
path:
- '**/details_harness|drop|3_2023-09-17T22-02-41.600326.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T22-02-41.600326.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T22_02_41.600326
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-02-41.600326.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-02-41.600326.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T22_02_41.600326
path:
- '**/details_harness|winogrande|5_2023-09-17T22-02-41.600326.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T22-02-41.600326.parquet'
- config_name: results
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- results_2023-08-26T01:57:15.339948.parquet
- split: 2023_09_17T22_02_41.600326
path:
- results_2023-09-17T22-02-41.600326.parquet
- split: latest
path:
- results_2023-09-17T22-02-41.600326.parquet
---
# Dataset Card for Evaluation run of ehartford/CodeLlama-34b-Python-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/CodeLlama-34b-Python-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/CodeLlama-34b-Python-hf](https://huggingface.co/ehartford/CodeLlama-34b-Python-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf",
"harness_winogrande_5",
split="train")
```
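Since the dataset exposes 64 configurations, it can help to enumerate them before picking one. A minimal sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf"
)
print(len(configs))   # number of available configurations
print(configs[:5])    # peek at the first few harness_* names
```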
## Latest results
These are the [latest results from run 2023-09-17T22:02:41.600326](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf/blob/main/results_2023-09-17T22-02-41.600326.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190325,
"f1": 0.0019200922818791944,
"f1_stderr": 0.0004138356823487018,
"acc": 0.3307024467245462,
"acc_stderr": 0.006650084932921209
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190325,
"f1": 0.0019200922818791944,
"f1_stderr": 0.0004138356823487018
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6614048934490924,
"acc_stderr": 0.013300169865842417
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
erfanzar/UltraChat-Mini | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: dialog
sequence: string
- name: user
sequence: string
- name: assistant
sequence: string
- name: system
dtype: string
- name: id
dtype: int64
- name: llama2_prompt
dtype: string
splits:
- name: train
num_bytes: 6005323184
num_examples: 239641
download_size: 2964129142
dataset_size: 6005323184
---
# Dataset Card for "UltraChat-Mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/test_ds_uwb_atc_noise_trial | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 684273363.9015728
num_examples: 3000
- name: test
num_bytes: 22809112.13005243
num_examples: 100
download_size: 709812743
dataset_size: 707082476.0316253
---
# Dataset Card for "test_ds_uwb_atc_noise_trial"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
enoahjr/twitter_dataset_1713189012 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 992034
num_examples: 2929
download_size: 542763
dataset_size: 992034
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/VALUE_mrpc_lexical | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 402763
num_examples: 1493
- name: train
num_bytes: 849076
num_examples: 3134
- name: validation
num_bytes: 97832
num_examples: 360
download_size: 921493
dataset_size: 1349671
---
# Dataset Card for "VALUE_mrpc_lexical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Areeb123/drug_reviews | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- medical
size_categories:
- 1M<n<10M
--- |
aintech/vdf_wolt_food |
---
tags:
- vdf
- vector-io
- vector-dataset
- vector-embeddings
---
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
|
tramzel/fndds | ---
license: unknown
---
|
RuudVelo/my_awesome_new_bike | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 29796713.0
num_examples: 10
download_size: 26771158
dataset_size: 29796713.0
---
|
betterMateusz/SAT_Writting_Reading_Assessment_Question_Bank | ---
language:
- en
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: passage
dtype: string
- name: question
dtype: string
- name: choice_A
dtype: string
- name: choice_B
dtype: string
- name: choice_C
dtype: string
- name: choice_D
dtype: string
- name: correct_answer
dtype: string
- name: rationale
dtype: string
- name: difficulty
dtype: string
- name: domain
dtype: string
splits:
- name: train
num_bytes: 706258.0
num_examples: 397
download_size: 346011
dataset_size: 706258.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for SAT Reading and Writing Dataset
This dataset card aims to be a base template for the SAT Reading and Writing Dataset, optimized for use with Hugging Face's datasets library.
## Dataset Details
### Dataset Description
This dataset contains SAT Reading and Writing assessment questions sourced from the College Board's SAT Suite Question Bank, intended for use in training and evaluating Language Models like LLMs.
- **Curated by:** College Board
- **License:** Creative Commons Attribution-ShareAlike 4.0 International License
### Dataset Sources
- **Repository:** [College Board SAT Suite Question Bank](https://satsuitequestionbank.collegeboard.org/)
## Uses
### Direct Use
The dataset can be used for SAT exam preparation and educational purposes. It is suitable for training and evaluating Language Models for SAT-style reading comprehension and writing tasks.
## Dataset Structure
The dataset contains questions with passages, choices, correct answers, and rationales, making it ideal for training and evaluating Language Models on SAT-style reading comprehension and writing tasks.
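For instance, a single item can be loaded and inspected as follows (a minimal sketch; the repo id and field names are taken from this card's header and declared metadata):
```python
from datasets import load_dataset

# Load the train split (the only split declared in this card's metadata).
ds = load_dataset("betterMateusz/SAT_Writting_Reading_Assessment_Question_Bank", split="train")

row = ds[0]
print(row["question"])
print(row["choice_A"], row["choice_B"], row["choice_C"], row["choice_D"])
print(row["correct_answer"], row["difficulty"], row["domain"])
```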
## Dataset Creation
### Curation Rationale
The dataset was created to provide a resource for students preparing for the SAT exam and for researchers and developers working on Natural Language Processing tasks related to standardized testing.
### Source Data
#### Data Collection and Processing
The questions were extracted from the College Board SAT Suite Question Bank using automated scraping and filtering processes.
#### Who are the source data producers?
The College Board is the producer of the source data.
## Bias, Risks, and Limitations
Users should be aware of the limitations of using this dataset for predicting SAT scores and should use it in conjunction with other resources for comprehensive SAT preparation.
### Recommendations
Users should use this dataset as a supplementary resource for SAT exam preparation and as a benchmark for evaluating the performance of Language Models on SAT-style reading comprehension and writing tasks.
## Citation
**APA:**
College Board. SAT Reading and Writing Dataset. Retrieved from [College Board SAT Suite Question Bank](https://satsuitequestionbank.collegeboard.org/)
|
YBXL/JAMA_Reasoning_test_Rare_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 334270
num_examples: 250
- name: valid
num_bytes: 334270
num_examples: 250
- name: test
num_bytes: 334270
num_examples: 250
download_size: 474801
dataset_size: 1002810
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
jlbaker361/actstu-dream-50 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: int64
- name: steps
dtype: int64
splits:
- name: train
num_bytes: 29844939.0
num_examples: 28
download_size: 29847370
dataset_size: 29844939.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
halaction/atm-data-transformers | ---
license: openrail
---
|
allenai/ms2_dense_max | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-MS^2
- extended|other-Cochrane
task_categories:
- summarization
- text2text-generation
paperswithcode_id: multi-document-summarization
pretty_name: MSLR Shared Task
---
This is a copy of the [MS^2](https://huggingface.co/datasets/allenai/mslr2022) dataset, except the input source documents of its `validation` split have been replaced by documents retrieved with a __dense__ retriever. The retrieval pipeline used (sketched in code below the list):
- __query__: The `background` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits. A document is the concatenation of the `title` and `abstract`.
- __retriever__: [`facebook/contriever-msmarco`](https://huggingface.co/facebook/contriever-msmarco) via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"max"`, i.e. the number of documents retrieved, `k`, is set as the maximum number of documents seen across examples in this dataset, in this case `k==25`
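A minimal sketch of this pipeline using the `transformers` API directly rather than PyTerrier; the mean pooling follows Contriever's published usage, and the placeholder strings stand in for a `background` query and `title` + `abstract` documents:
```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("facebook/contriever-msmarco")
model = AutoModel.from_pretrained("facebook/contriever-msmarco")

def embed(texts):
    # Contriever embeddings are mean-pooled token states.
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    mask = inputs["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(1) / mask.sum(1)

query_vec = embed(["<background field of one example>"])
doc_vecs = embed(["<title> <abstract>", "<title> <abstract>"])
scores = (query_vec @ doc_vecs.T).squeeze(0)  # dot-product relevance
k = min(25, doc_vecs.shape[0])                # "max" strategy: k == 25 here
top_docs = scores.topk(k).indices
```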
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.4764 | 0.2395 | 0.1932 | 0.2895 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.4364 | 0.2125 | 0.1823 | 0.2524 |
Retrieval results on the `test` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.4481 | 0.2224 | 0.1943 | 0.2567 | |
ParallelnoMinded/promo_squad_ru | ---
license: apache-2.0
---
|
kailasv/ArtWhisperer | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: user_id
dtype: string
- name: target_id
dtype: string
- name: target_image
dtype: image
- name: target_positive_prompt
dtype: string
- name: target_negative_prompt
dtype: string
- name: target_image_embedding
sequence:
- name: value
dtype: float32
- name: target_positive_text_embedding
sequence:
- name: value
dtype: float32
- name: target_negative_text_embedding
sequence:
- name: value
dtype: float32
- name: Famous person?
dtype: bool
- name: Famous landmark?
dtype: bool
- name: Manmade?
dtype: bool
- name: People?
dtype: bool
- name: Real image?
dtype: bool
- name: AI image?
dtype: bool
- name: Art?
dtype: bool
- name: Nature?
dtype: bool
- name: City?
dtype: bool
- name: Fantasy?
dtype: bool
- name: Sci-fi or space?
dtype: bool
- name: generated_image
dtype: image
- name: generated_positive_prompt
dtype: string
- name: generated_negative_prompt
dtype: string
- name: generated_image_embedding
sequence:
- name: value
dtype: float32
- name: generated_positive_text_embedding
sequence:
- name: value
dtype: float32
- name: generated_negative_text_embedding
sequence:
- name: value
dtype: float32
- name: ai_model_name
dtype: string
- name: trajectory_index
dtype: int32
- name: score
dtype: int32
- name: human_rating
dtype: float32
- name: time_taken
dtype: duration[s]
- name: filtered_image
dtype: bool
splits:
- name: train
num_bytes: 5743017316.686
num_examples: 51026
- name: validation
num_bytes: 475257048.94
num_examples: 4572
download_size: 2185134483
dataset_size: 6218274365.625999
---
|
A2H0H0R1/Animal-nutrition-pair | ---
dataset_info:
features:
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 10252777
num_examples: 5027
download_size: 4012629
dataset_size: 10252777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
blanchon/FAIR1M_Small_Caption | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5528315890.896
num_examples: 22312
download_size: 5560660833
dataset_size: 5528315890.896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sdmattpotter/hftest61223 | ---
license: mit
---
|
DjSteker/dataset_ham_spam | ---
dataset_info:
features:
- name: IsSpam
struct:
- name: '0'
dtype: string
- name: Text
struct:
- name: '0'
dtype: string
splits:
- name: train
num_bytes: 385
num_examples: 1
download_size: 3829
dataset_size: 385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ziozzang/deepl-trans-IT-KO | ---
task_categories:
- translation
language:
- ko
- it
---
This dataset contains Wikipedia articles with DeepL translations, auto-aggregated.
# String/Corpus pairs
From IT/Italian to KO/Korean.
# Quality Filtering
- Stripped all HTML tags.
- Removed references and annotation marks.
- Filtered by string length.
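A rough sketch of these filtering steps (the regexes and length thresholds are illustrative assumptions, not the author's documented ones):
```python
import re

def clean(text):
    text = re.sub(r"<[^>]+>", "", text)  # strip HTML tags
    text = re.sub(r"\[\d+\]", "", text)  # drop reference marks like [12]
    text = text.strip()
    # Length filter; the actual bounds used are not documented here.
    return text if 10 <= len(text) <= 2000 else None
```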
---
The strings/corpus were aggregated from Wikipedia (it) and translated using DeepL.
All data collected by Jioh L. Jung <ziozzang@gmail.com>
license: mit
--- |
M-A-D/Mixed-Arabic-Dataset-Main | ---
language:
- ar
task_categories:
- conversational
- text-generation
- text2text-generation
- translation
- summarization
pretty_name: MAD
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: GenId
dtype: int64
- name: SubId
dtype: int64
- name: DatasetName
dtype: string
- name: DatasetLink
dtype: string
- name: Text
dtype: string
- name: MetaData
struct:
- name: AboutAuthor
dtype: string
- name: AboutBook
dtype: string
- name: Author
dtype: string
- name: AuthorName
dtype: string
- name: BookLink
dtype: string
- name: BookName
dtype: string
- name: ChapterLink
dtype: string
- name: ChapterName
dtype: string
- name: Tags
dtype: float64
- name: __index_level_0__
dtype: float64
- name: created_date
dtype: string
- name: deleted
dtype: bool
- name: detoxify
dtype: 'null'
- name: emojis
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: id
dtype: string
- name: labels
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: value
sequence: float64
- name: lang
dtype: string
- name: message_id
dtype: string
- name: message_tree_id
dtype: string
- name: model_name
dtype: 'null'
- name: parent_id
dtype: string
- name: query_id
dtype: string
- name: rank
dtype: float64
- name: review_count
dtype: float64
- name: review_result
dtype: bool
- name: role
dtype: string
- name: synthetic
dtype: bool
- name: title
dtype: string
- name: tree_state
dtype: string
- name: url
dtype: string
- name: user_id
dtype: string
- name: ConcatenatedText
dtype: int64
- name: __index_level_0__
dtype: float64
splits:
- name: train
num_bytes: 1990497610
num_examples: 131393
download_size: 790648134
dataset_size: 1990497610
---
# Dataset Card for "Mixed-Arabic-Dataset"
## Mixed Arabic Datasets (MAD)
The Mixed Arabic Datasets (MAD) project provides a comprehensive collection of diverse Arabic-language datasets, sourced from various repositories, platforms, and domains. These datasets cover a wide range of text types, including books, articles, Wikipedia content, stories, and more.
### MAD Repo vs. MAD Main
#### MAD Repo
- **Versatility**: In the MAD Repository (MAD Repo), datasets are made available in their original, native form. Researchers and practitioners can selectively download specific datasets that align with their specific interests or requirements.
- **Independent Access**: Each dataset is self-contained, enabling users to work with individual datasets independently, allowing for focused analyses and experiments.
#### MAD Main or simply MAD
- **Unified Dataframe**: MAD Main represents a harmonized and unified dataframe, incorporating all datasets from the MAD Repository. It provides a seamless and consolidated view of the entire MAD collection, making it convenient for comprehensive analyses and applications.
- **Holistic Perspective**: Researchers can access a broad spectrum of Arabic-language content within a single dataframe, promoting holistic exploration and insights across diverse text sources.
### Why MAD Main?
- **Efficiency**: Working with MAD Main streamlines the data acquisition process by consolidating multiple datasets into one structured dataframe. This is particularly beneficial for large-scale projects or studies requiring diverse data sources.
- **Interoperability**: With MAD Main, the datasets are integrated into a standardized format, enhancing interoperability and compatibility with a wide range of data processing and analysis tools.
- **Meta-Analysis**: Researchers can conduct comprehensive analyses, such as cross-domain studies, trend analyses, or comparative studies, by leveraging the combined richness of all MAD datasets.
### Getting Started
- To access individual datasets in their original form, refer to the MAD Repository ([Link to MAD Repo](https://huggingface.co/datasets/M-A-D/Mixed-Arabic-Datasets-Repo)).
- For a unified view of all datasets, conveniently organized in a dataframe, you are here in the right place.
```python
from datasets import load_dataset
dataset = load_dataset("M-A-D/Mixed-Arabic-Dataset-Main")
```
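Building on the snippet above, cross-source analyses can slice the unified frame by its `DatasetName` column (the placeholder value below is illustrative, not a real source name):
```python
# Keep only rows that originate from one source dataset.
subset = dataset["train"].filter(lambda ex: ex["DatasetName"] == "<source name>")
print(subset.num_rows)
```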
### Join Us on Discord
For discussions, contributions, and community interactions, join us on Discord! [](https://discord.gg/2NpJ9JGm)
### How to Contribute
Want to contribute to the Mixed Arabic Datasets project? Follow our comprehensive guide on Google Colab for step-by-step instructions: [Contribution Guide](https://colab.research.google.com/drive/1w7_7lL6w7nM9DcDmTZe1Vfiwkio6SA-w?usp=sharing).
**Note**: If you'd like to test a contribution before submitting it, feel free to do so on the [MAD Test Dataset](https://huggingface.co/datasets/M-A-D/Mixed-Arabic-Dataset-test).
## Citation
```
@dataset{
title = {Mixed Arabic Datasets (MAD)},
author = {MAD Community},
howpublished = {Dataset},
url = {https://huggingface.co/datasets/M-A-D/Mixed-Arabic-Datasets-Repo},
year = {2023},
}
``` |
ImruQays/Alukah-Arabic | ---
language:
- ar
license: cc-by-4.0
---
# Introduction
This dataset is a comprehensive collection of articles sourced from the Alukah website, a renowned platform offering extensive content primarily in Arabic. Alukah is known for its high-quality Arabic prose, significantly surpassing the standard found in contemporary media outlets. The majority of the articles are contributed by Muslim scholars, encompassing a wide range of topics related to Islam and the Muslim community. The dataset also includes a valuable section on fatwas, which could be instrumental in developing question-answer datasets for Islamic jurisprudence.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Language(s) (NLP):** [Arabic, minor content in other languages]
- **License:** [Refer to [Alukah terms of use](https://www.alukah.net/pages/terms_of_use.aspx)]
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Website:** [https://www.alukah.net/]
## Uses
The Alukah Arabic Articles Collection is particularly suitable for training large language models (LLMs) in Arabic. It offers a refined variant of the language that stands in contrast to the more commonly found less sophisticated forms in modern media. This dataset is an invaluable resource for:
- Language Model Training: Enriching LLMs with high-quality Arabic data, enhancing their understanding and generation capabilities in the language.
- Islamic Content Analysis: Providing a rich source of Islamic scholarly articles for research and analysis in religious studies, cultural studies, and linguistics.
- Historical and Cultural Research: The dataset can be used as a reference for studying the evolution of Arabic language usage in scholarly contexts.
## Dataset Structure
The dataset is organized into 9 files, each representing a distinct section of the Alukah website. It is important to note the potential for duplicate articles across these files, as some topics may overlap.
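Because sections overlap, exact duplicates can be flagged with a simple hash pass before use (a sketch; how the 9 files are read into `articles` is left open, since this card does not document the schema):
```python
import hashlib

def find_duplicates(articles):
    """Return indices of articles whose exact text was already seen."""
    seen, dupes = set(), []
    for i, text in enumerate(articles):
        digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
        if digest in seen:
            dupes.append(i)
        seen.add(digest)
    return dupes
```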
## Quality of Arabic Writing
While the articles on Alukah showcase a superior level of Arabic compared to contemporary writings, it's important to acknowledge that even these articles may not fully match the exemplary standards of classical Arabic literature. For enthusiasts and researchers aiming to explore the pinnacle of Arabic literary excellence, it is recommended to refer to works that are over 200 years old or consult the book "العرنجية" for further insights into the nuances of high-quality Arabic prose. |
ehcalabres/ravdess_speech | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- audio-classification
task_ids:
- speech-emotion-recognition
---
# Dataset Card for ravdess_speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** https://zenodo.org/record/1188976#.YUS4MrozZdS
- **Paper:** https://doi.org/10.1371/journal.pone.0196391
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** ravdess@gmail.com
### Dataset Summary
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) contains 24 professional actors (12 female, 12 male), vocalizing two lexically-matched statements in a neutral North American accent. Speech includes calm, happy, sad, angry, fearful, surprise, and disgust expressions. Each expression is produced at two levels of emotional intensity (normal, strong), with an additional neutral expression. The audio files are 16-bit, 48 kHz .wav.
### Supported Tasks and Leaderboards
- audio-classification: The dataset can be used to train a model for audio classification, i.e., predicting the emotion expressed in each audio recording (a filename-parsing sketch follows).
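RAVDESS encodes its labels in each filename as seven dash-separated fields (per the Zenodo record linked above), with the third field carrying the emotion; a minimal parsing sketch:
```python
# Third filename field -> emotion label (RAVDESS filename convention).
EMOTIONS = {
    "01": "neutral", "02": "calm", "03": "happy", "04": "sad",
    "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised",
}

def emotion_from_filename(name):
    # e.g. "03-01-06-01-02-01-12.wav" -> "fearful"
    return EMOTIONS[name.split("-")[2]]
```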
### Languages
The audios available in the dataset are in English spoken by actors in a neutral North American accent.
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
The RAVDESS is released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, CC BY-NC-SA 4.0
Commercial licenses for the RAVDESS can also be purchased. For more information, please visit our license fee page, or contact us at ravdess@gmail.com.
### Citation Information
Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. https://doi.org/10.1371/journal.pone.0196391. |
CountFloyd/bark-german-semantic-wav-training | ---
language:
- de
--- |
AdapterOcean/data-standardized_cluster_23_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7951640
num_examples: 6750
download_size: 3438336
dataset_size: 7951640
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_23_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indiejoseph/wikipedia-en-filtered | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 49741517
num_examples: 17260
download_size: 27011805
dataset_size: 49741517
language:
- en
---
# Dataset Card for "wikipedia-en-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_0 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1165136164
num_examples: 228817
download_size: 1184952464
dataset_size: 1165136164
---
# Dataset Card for "chunk_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PDBEurope/protein_chain_conformational_states | ---
license: cc-by-4.0
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- feature-extraction
tags:
- Structural biology
- Bioinformatics
- Machine learning
- Conformation
- Conformational state
- Monomeric
- Training data
- Benchmark
- Manually curated
pretty_name: Curated dataset of protein chain conformational states
---
## Schema description:
The manually curated dataset of open-closed monomers is included here as `benchmarking_monomeric_open_closed_conformers.csv`.
Column descriptions:
- **`UNP_ACC`** | UniProt accession code
- **`UNP_START`** | Start of UniProt sequence for given PDBe entries
- **`UNP_END`** | End of UniProt sequence for given PDBe entries
- **`PDBe_ID`** | Protein Data Bank code
- **`CHAIN_ID`** | Author declared chain ID (`char`)
- **`label_asym_id`** | Programmatically assigned chain ID (`char`)
- **`CONFORMER_ID`** | Unique code for PDBe entries with distinct conformation, corresponding to a given UniProt accession
- **`CONFORMER_DESCR`** | Short description of conformation, based on depositor's assessment of the protein/conformation
- **`LIT_CONFIRMED`** | True/false value based on whether a publication (scientific literature) was available for manually curating clusters. NB: Clusters with 0 in this field should be used with caution.
- **`ALT_CONFORMER_ID`** | Where the publication for a structure is currently outstanding, an executive decision on the conformation classification is made. Where the literature is not explicit on the features of a given conformation, the second most suitable `CONFORMER_ID` is provided in this column. Blank cells have no other likely conformation assignment and are therefore the same as in `CONFORMER_ID`.
- **`ALT_CONFORMER_DESCR`** | Description for conformation in alternative conformation ID.
## Curation process
As of 09 Mar 2022, a manually curated dataset of monomeric protein conformations has been collated, containing 'open'-'closed' pairs as well as intermediary states defined by the authors of the entry.
1. The PDBe was queried, through its Oracle DB, to find PDBe entries with 100 % sequence identity for a UniProt segment in both 'open' and 'closed' conformations, as stated in the entry's `TITLE` field. The query used:
```
select b.accession, b.unp_start, b.unp_end, a.id, a.title, d.id, d.title
from entry a, unp_entity b, unp_entity c, entry d, pdb_assembly e
where a.title like '%open%' and d.title like '%close%'
and a.id = b.entry_id and d.id = c.entry_id and a.id != d.id
and b.accession = c.accession
and b.unp_start = c.unp_start
and b.unp_end = c.unp_end
and a.id = e.entry_id
and e.type = 'homo'
and e.name = 'monomer'
```
This query was written by Dr Sameer Velankar.
2. These results were cleaned to remove entries with 'open' or 'close' substrings in their `TITLE` fields that did not refer to conformation. The 'open' substring often appeared in ligand names in the entries' `TITLE` field, such as in *dichlorido(1,3-dimethylbenzimidazol-2-ylidene)(eta5-pentamethylcycl**open**tadienyl)rhodium(III)*, and 'close' in terms like *dis**close**s* (a word-boundary regex sketch of this filter follows the list).
3. All remaining entries were then manually curated by investigating each PDBe entry's corresponding publication, where available.
1. Additional PDBe entries submitted by the authors, which were missed in the original search due to a lack of 'open' or 'close' substrings in their `TITLE` field but stated as fitting one of the states in the publication, were added.
2. For some UniProt accessions, intermediary conformations were reported by the authors and these were noted in the dataset under the `CONFORMER_DESCR` column.
3. Entries deposited in monomeric form but solved as a multimeric complex were also removed.
4. PDBe entries, now clustered by author-stated conformation, were cross-referenced against the PDBe-KB's existing clustering algorithm (available on the [Aggregate Views of Proteins](https://www.ebi.ac.uk/pdbe/pdbe-kb/protein) page) to assess current conformer clustering success. These results are currently awaiting publication.
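A word-boundary regex captures the cleaning heuristic of step 2 (a sketch; the curators' exact procedure is not specified in this card):
```python
import re

# Match "open"/"close(d)" only as standalone words, so substrings inside
# ligand names (...cyclopentadienyl...) or words like "discloses" are ignored.
CONF = re.compile(r"\b(open|closed?)\b", re.IGNORECASE)

def mentions_conformation(title):
    return bool(CONF.search(title))

print(mentions_conformation("Crystal structure in the open state"))   # True
print(mentions_conformation("pentamethylcyclopentadienyl complex"))   # False
print(mentions_conformation("The structure discloses a novel fold"))  # False
```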
### Curation process outline
<img src="http://ftp.ebi.ac.uk/pub/databases/pdbe-kb/benchmarking/distinct-monomer-conformers/work_progress_flowdiagram_200pc.png" alt="Curation flow diagram">
### Dataset summary
<img src="http://ftp.ebi.ac.uk/pub/databases/pdbe-kb/benchmarking/distinct-monomer-conformers/summary_data_visualisation.png" alt="Benchmark dataset summary graphs"> |
Back-up/facebook_comment_augmentation-v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: hash
dtype: int64
splits:
- name: train
num_bytes: 213595318.3206353
num_examples: 1328546
download_size: 119399417
dataset_size: 213595318.3206353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tohrumi/iwslt15_fuzzy_1000_train_samples | ---
dataset_info:
features:
- name: id
dtype: int64
- name: translation
struct:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 636042.8958904734
num_examples: 1000
- name: test
num_bytes: 329197
num_examples: 1268
download_size: 574472
dataset_size: 965239.8958904734
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
anhtu12st/papers | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 45435
num_examples: 155
download_size: 28085
dataset_size: 45435
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "papers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_22 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30475531164.0
num_examples: 267733
download_size: 30239263328
dataset_size: 30475531164.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
mlabonne/truthy-dpo-v0.1 | ---
language:
- en
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1344072
num_examples: 1016
download_size: 652993
dataset_size: 1344072
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fhaddad/autotrain-data-fhdd_arabic_chatbot | ---
language:
- en
- ar
task_categories:
- translation
---
# AutoTrain Dataset for project: fhdd_arabic_chatbot
## Dataset Description
This dataset has been automatically processed by AutoTrain for project fhdd_arabic_chatbot.
### Languages
The BCP-47 code for the dataset's language is en2ar.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_sourceLang": "ara",
"feat_targetlang": "eng",
"target": "\u064a\u0646\u0628\u063a\u064a \u0623\u0646 \u062a\u064f\u0638\u0647\u0631 \u0627\u0644\u0646\u0651\u0633\u0627\u0621 \u0648\u062c\u0648\u0647\u0647\u0646\u0651.",
"source": "Women should have their faces visible."
},
{
"feat_sourceLang": "ara",
"feat_targetlang": "eng",
"target": "\u0623\u062a\u062f\u0631\u0633 \u0627\u0644\u0625\u0646\u062c\u0644\u064a\u0632\u064a\u0629\u061f",
"source": "Do you study English?"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_sourceLang": "Value(dtype='string', id=None)",
"feat_targetlang": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)",
"source": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 15622 |
| valid | 3906 |
|
anan-2024/twitter_dataset_1713142048 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 192326
num_examples: 521
download_size: 106584
dataset_size: 192326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
datasets-examples/doc-formats-csv-3 | ---
configs:
- config_name: default
data_files: "data.csv"
delimiter: "|"
header: 1
names: ["kind", "sound"]
size_categories:
- n<1K
---
# [doc] formats - csv - 3
This dataset contains one csv file at the root:
- [data.csv](./data.csv)
```csv
# ignored comment
col1|col2
dog|woof
cat|meow
pokemon|pika
human|hello
```
We define the config name in the YAML config, as well as the exact location of the file, the separator (`"|"`), the column names, and the number of rows to ignore (row #1 contains the original column headers, which are replaced by the `names` option, and row #0 is ignored). The reference for the options is the [documentation of pandas.read_csv()](https://pandas.pydata.org/docs/reference/api/pandas.read_csv.html).
```yaml
---
configs:
- config_name: default
data_files: "data.csv"
delimiter: "|"
header: 1
names: ["kind", "sound"]
size_categories:
- n<1K
---
```
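Loading the config confirms the parsed shape (a quick sketch, assuming the repo id shown in this card's header):
```python
from datasets import load_dataset

ds = load_dataset("datasets-examples/doc-formats-csv-3", split="train")
print(ds.column_names)  # ['kind', 'sound']
print(ds[0])            # {'kind': 'dog', 'sound': 'woof'}
```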
|
fathyshalab/massive_social-de-DE | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: partition
dtype: string
- name: scenario
dtype:
class_label:
names:
'0': social
'1': transport
'2': calendar
'3': play
'4': news
'5': datetime
'6': recommendation
'7': email
'8': iot
'9': general
'10': audio
'11': lists
'12': qa
'13': cooking
'14': takeaway
'15': music
'16': alarm
'17': weather
- name: intent
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: text
dtype: string
- name: annot_utt
dtype: string
- name: worker_id
dtype: string
- name: slot_method
sequence:
- name: slot
dtype: string
- name: method
dtype: string
- name: judgments
sequence:
- name: worker_id
dtype: string
- name: intent_score
dtype: int8
- name: slots_score
dtype: int8
- name: grammar_score
dtype: int8
- name: spelling_score
dtype: int8
- name: language_identification
dtype: string
- name: label_name
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 129790
num_examples: 391
- name: validation
num_bytes: 22472
num_examples: 68
- name: test
num_bytes: 34107
num_examples: 106
download_size: 70328
dataset_size: 186369
---
# Dataset Card for "massive_social-de-DE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/thematic2aembed | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 691745022
num_examples: 908625
download_size: 205106568
dataset_size: 691745022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gilsonk12/Rastreando | ---
license: openrail
---
|
shreyasharma/sentence_eval_aa2 | ---
dataset_info:
features:
- name: declarativized
dtype: string
- name: correct
dtype: bool
splits:
- name: train
num_bytes: 35463
num_examples: 615
- name: validation
num_bytes: 18279
num_examples: 315
- name: test
num_bytes: 17185
num_examples: 300
download_size: 56380
dataset_size: 70927
---
# Dataset Card for "sentence_eval_aa2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RikeshSilwal/nepali_corpora | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1725683703
num_examples: 3253409
download_size: 496020433
dataset_size: 1725683703
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yankihue/tweets-turkish | ---
language:
- tr
--- |
taspecustu/Nanachi | ---
license: cc-by-4.0
---
|
ikala/tmmluplus | ---
license: other
license_name: creative-commons-by-nc
task_categories:
- question-answering
language:
- zh
tags:
- traditional chinese
- finance
- medical
- taiwan
- benchmark
- zh-tw
- zh-hant
pretty_name: tmmlu++
size_categories:
- 100K<n<1M
configs:
- config_name: engineering_math
data_files:
- split: train
path: "data/engineering_math_dev.csv"
- split: validation
path: "data/engineering_math_val.csv"
- split: test
path: "data/engineering_math_test.csv"
- config_name: dentistry
data_files:
- split: train
path: "data/dentistry_dev.csv"
- split: validation
path: "data/dentistry_val.csv"
- split: test
path: "data/dentistry_test.csv"
- config_name: traditional_chinese_medicine_clinical_medicine
data_files:
- split: train
path: "data/traditional_chinese_medicine_clinical_medicine_dev.csv"
- split: validation
path: "data/traditional_chinese_medicine_clinical_medicine_val.csv"
- split: test
path: "data/traditional_chinese_medicine_clinical_medicine_test.csv"
- config_name: clinical_psychology
data_files:
- split: train
path: "data/clinical_psychology_dev.csv"
- split: validation
path: "data/clinical_psychology_val.csv"
- split: test
path: "data/clinical_psychology_test.csv"
- config_name: technical
data_files:
- split: train
path: "data/technical_dev.csv"
- split: validation
path: "data/technical_val.csv"
- split: test
path: "data/technical_test.csv"
- config_name: culinary_skills
data_files:
- split: train
path: "data/culinary_skills_dev.csv"
- split: validation
path: "data/culinary_skills_val.csv"
- split: test
path: "data/culinary_skills_test.csv"
- config_name: mechanical
data_files:
- split: train
path: "data/mechanical_dev.csv"
- split: validation
path: "data/mechanical_val.csv"
- split: test
path: "data/mechanical_test.csv"
- config_name: logic_reasoning
data_files:
- split: train
path: "data/logic_reasoning_dev.csv"
- split: validation
path: "data/logic_reasoning_val.csv"
- split: test
path: "data/logic_reasoning_test.csv"
- config_name: real_estate
data_files:
- split: train
path: "data/real_estate_dev.csv"
- split: validation
path: "data/real_estate_val.csv"
- split: test
path: "data/real_estate_test.csv"
- config_name: general_principles_of_law
data_files:
- split: train
path: "data/general_principles_of_law_dev.csv"
- split: validation
path: "data/general_principles_of_law_val.csv"
- split: test
path: "data/general_principles_of_law_test.csv"
- config_name: finance_banking
data_files:
- split: train
path: "data/finance_banking_dev.csv"
- split: validation
path: "data/finance_banking_val.csv"
- split: test
path: "data/finance_banking_test.csv"
- config_name: anti_money_laundering
data_files:
- split: train
path: "data/anti_money_laundering_dev.csv"
- split: validation
path: "data/anti_money_laundering_val.csv"
- split: test
path: "data/anti_money_laundering_test.csv"
- config_name: ttqav2
data_files:
- split: train
path: "data/ttqav2_dev.csv"
- split: validation
path: "data/ttqav2_val.csv"
- split: test
path: "data/ttqav2_test.csv"
- config_name: marketing_management
data_files:
- split: train
path: "data/marketing_management_dev.csv"
- split: validation
path: "data/marketing_management_val.csv"
- split: test
path: "data/marketing_management_test.csv"
- config_name: business_management
data_files:
- split: train
path: "data/business_management_dev.csv"
- split: validation
path: "data/business_management_val.csv"
- split: test
path: "data/business_management_test.csv"
- config_name: organic_chemistry
data_files:
- split: train
path: "data/organic_chemistry_dev.csv"
- split: validation
path: "data/organic_chemistry_val.csv"
- split: test
path: "data/organic_chemistry_test.csv"
- config_name: advance_chemistry
data_files:
- split: train
path: "data/advance_chemistry_dev.csv"
- split: validation
path: "data/advance_chemistry_val.csv"
- split: test
path: "data/advance_chemistry_test.csv"
- config_name: physics
data_files:
- split: train
path: "data/physics_dev.csv"
- split: validation
path: "data/physics_val.csv"
- split: test
path: "data/physics_test.csv"
- config_name: secondary_physics
data_files:
- split: train
path: "data/secondary_physics_dev.csv"
- split: validation
path: "data/secondary_physics_val.csv"
- split: test
path: "data/secondary_physics_test.csv"
- config_name: human_behavior
data_files:
- split: train
path: "data/human_behavior_dev.csv"
- split: validation
path: "data/human_behavior_val.csv"
- split: test
path: "data/human_behavior_test.csv"
- config_name: national_protection
data_files:
- split: train
path: "data/national_protection_dev.csv"
- split: validation
path: "data/national_protection_val.csv"
- split: test
path: "data/national_protection_test.csv"
- config_name: jce_humanities
data_files:
- split: train
path: "data/jce_humanities_dev.csv"
- split: validation
path: "data/jce_humanities_val.csv"
- split: test
path: "data/jce_humanities_test.csv"
- config_name: politic_science
data_files:
- split: train
path: "data/politic_science_dev.csv"
- split: validation
path: "data/politic_science_val.csv"
- split: test
path: "data/politic_science_test.csv"
- config_name: agriculture
data_files:
- split: train
path: "data/agriculture_dev.csv"
- split: validation
path: "data/agriculture_val.csv"
- split: test
path: "data/agriculture_test.csv"
- config_name: official_document_management
data_files:
- split: train
path: "data/official_document_management_dev.csv"
- split: validation
path: "data/official_document_management_val.csv"
- split: test
path: "data/official_document_management_test.csv"
- config_name: financial_analysis
data_files:
- split: train
path: "data/financial_analysis_dev.csv"
- split: validation
path: "data/financial_analysis_val.csv"
- split: test
path: "data/financial_analysis_test.csv"
- config_name: pharmacy
data_files:
- split: train
path: "data/pharmacy_dev.csv"
- split: validation
path: "data/pharmacy_val.csv"
- split: test
path: "data/pharmacy_test.csv"
- config_name: educational_psychology
data_files:
- split: train
path: "data/educational_psychology_dev.csv"
- split: validation
path: "data/educational_psychology_val.csv"
- split: test
path: "data/educational_psychology_test.csv"
- config_name: statistics_and_machine_learning
data_files:
- split: train
path: "data/statistics_and_machine_learning_dev.csv"
- split: validation
path: "data/statistics_and_machine_learning_val.csv"
- split: test
path: "data/statistics_and_machine_learning_test.csv"
- config_name: management_accounting
data_files:
- split: train
path: "data/management_accounting_dev.csv"
- split: validation
path: "data/management_accounting_val.csv"
- split: test
path: "data/management_accounting_test.csv"
- config_name: introduction_to_law
data_files:
- split: train
path: "data/introduction_to_law_dev.csv"
- split: validation
path: "data/introduction_to_law_val.csv"
- split: test
path: "data/introduction_to_law_test.csv"
- config_name: computer_science
data_files:
- split: train
path: "data/computer_science_dev.csv"
- split: validation
path: "data/computer_science_val.csv"
- split: test
path: "data/computer_science_test.csv"
- config_name: veterinary_pathology
data_files:
- split: train
path: "data/veterinary_pathology_dev.csv"
- split: validation
path: "data/veterinary_pathology_val.csv"
- split: test
path: "data/veterinary_pathology_test.csv"
- config_name: accounting
data_files:
- split: train
path: "data/accounting_dev.csv"
- split: validation
path: "data/accounting_val.csv"
- split: test
path: "data/accounting_test.csv"
- config_name: fire_science
data_files:
- split: train
path: "data/fire_science_dev.csv"
- split: validation
path: "data/fire_science_val.csv"
- split: test
path: "data/fire_science_test.csv"
- config_name: optometry
data_files:
- split: train
path: "data/optometry_dev.csv"
- split: validation
path: "data/optometry_val.csv"
- split: test
path: "data/optometry_test.csv"
- config_name: insurance_studies
data_files:
- split: train
path: "data/insurance_studies_dev.csv"
- split: validation
path: "data/insurance_studies_val.csv"
- split: test
path: "data/insurance_studies_test.csv"
- config_name: pharmacology
data_files:
- split: train
path: "data/pharmacology_dev.csv"
- split: validation
path: "data/pharmacology_val.csv"
- split: test
path: "data/pharmacology_test.csv"
- config_name: taxation
data_files:
- split: train
path: "data/taxation_dev.csv"
- split: validation
path: "data/taxation_val.csv"
- split: test
path: "data/taxation_test.csv"
- config_name: trust_practice
data_files:
- split: train
path: "data/trust_practice_dev.csv"
- split: validation
path: "data/trust_practice_val.csv"
- split: test
path: "data/trust_practice_test.csv"
- config_name: geography_of_taiwan
data_files:
- split: train
path: "data/geography_of_taiwan_dev.csv"
- split: validation
path: "data/geography_of_taiwan_val.csv"
- split: test
path: "data/geography_of_taiwan_test.csv"
- config_name: physical_education
data_files:
- split: train
path: "data/physical_education_dev.csv"
- split: validation
path: "data/physical_education_val.csv"
- split: test
path: "data/physical_education_test.csv"
- config_name: auditing
data_files:
- split: train
path: "data/auditing_dev.csv"
- split: validation
path: "data/auditing_val.csv"
- split: test
path: "data/auditing_test.csv"
- config_name: administrative_law
data_files:
- split: train
path: "data/administrative_law_dev.csv"
- split: validation
path: "data/administrative_law_val.csv"
- split: test
path: "data/administrative_law_test.csv"
- config_name: education_(profession_level)
data_files:
- split: train
path: "data/education_(profession_level)_dev.csv"
- split: validation
path: "data/education_(profession_level)_val.csv"
- split: test
path: "data/education_(profession_level)_test.csv"
- config_name: economics
data_files:
- split: train
path: "data/economics_dev.csv"
- split: validation
path: "data/economics_val.csv"
- split: test
path: "data/economics_test.csv"
- config_name: veterinary_pharmacology
data_files:
- split: train
path: "data/veterinary_pharmacology_dev.csv"
- split: validation
path: "data/veterinary_pharmacology_val.csv"
- split: test
path: "data/veterinary_pharmacology_test.csv"
- config_name: nautical_science
data_files:
- split: train
path: "data/nautical_science_dev.csv"
- split: validation
path: "data/nautical_science_val.csv"
- split: test
path: "data/nautical_science_test.csv"
- config_name: occupational_therapy_for_psychological_disorders
data_files:
- split: train
path: "data/occupational_therapy_for_psychological_disorders_dev.csv"
- split: validation
path: "data/occupational_therapy_for_psychological_disorders_val.csv"
- split: test
path: "data/occupational_therapy_for_psychological_disorders_test.csv"
- config_name: basic_medical_science
data_files:
- split: train
path: "data/basic_medical_science_dev.csv"
- split: validation
path: "data/basic_medical_science_val.csv"
- split: test
path: "data/basic_medical_science_test.csv"
- config_name: macroeconomics
data_files:
- split: train
path: "data/macroeconomics_dev.csv"
- split: validation
path: "data/macroeconomics_val.csv"
- split: test
path: "data/macroeconomics_test.csv"
- config_name: trade
data_files:
- split: train
path: "data/trade_dev.csv"
- split: validation
path: "data/trade_val.csv"
- split: test
path: "data/trade_test.csv"
- config_name: chinese_language_and_literature
data_files:
- split: train
path: "data/chinese_language_and_literature_dev.csv"
- split: validation
path: "data/chinese_language_and_literature_val.csv"
- split: test
path: "data/chinese_language_and_literature_test.csv"
- config_name: tve_design
data_files:
- split: train
path: "data/tve_design_dev.csv"
- split: validation
path: "data/tve_design_val.csv"
- split: test
path: "data/tve_design_test.csv"
- config_name: junior_science_exam
data_files:
- split: train
path: "data/junior_science_exam_dev.csv"
- split: validation
path: "data/junior_science_exam_val.csv"
- split: test
path: "data/junior_science_exam_test.csv"
- config_name: junior_math_exam
data_files:
- split: train
path: "data/junior_math_exam_dev.csv"
- split: validation
path: "data/junior_math_exam_val.csv"
- split: test
path: "data/junior_math_exam_test.csv"
- config_name: junior_chinese_exam
data_files:
- split: train
path: "data/junior_chinese_exam_dev.csv"
- split: validation
path: "data/junior_chinese_exam_val.csv"
- split: test
path: "data/junior_chinese_exam_test.csv"
- config_name: junior_social_studies
data_files:
- split: train
path: "data/junior_social_studies_dev.csv"
- split: validation
path: "data/junior_social_studies_val.csv"
- split: test
path: "data/junior_social_studies_test.csv"
- config_name: tve_mathematics
data_files:
- split: train
path: "data/tve_mathematics_dev.csv"
- split: validation
path: "data/tve_mathematics_val.csv"
- split: test
path: "data/tve_mathematics_test.csv"
- config_name: tve_chinese_language
data_files:
- split: train
path: "data/tve_chinese_language_dev.csv"
- split: validation
path: "data/tve_chinese_language_val.csv"
- split: test
path: "data/tve_chinese_language_test.csv"
- config_name: tve_natural_sciences
data_files:
- split: train
path: "data/tve_natural_sciences_dev.csv"
- split: validation
path: "data/tve_natural_sciences_val.csv"
- split: test
path: "data/tve_natural_sciences_test.csv"
- config_name: junior_chemistry
data_files:
- split: train
path: "data/junior_chemistry_dev.csv"
- split: validation
path: "data/junior_chemistry_val.csv"
- split: test
path: "data/junior_chemistry_test.csv"
- config_name: music
data_files:
- split: train
path: "data/music_dev.csv"
- split: validation
path: "data/music_val.csv"
- split: test
path: "data/music_test.csv"
- config_name: education
data_files:
- split: train
path: "data/education_dev.csv"
- split: validation
path: "data/education_val.csv"
- split: test
path: "data/education_test.csv"
- config_name: three_principles_of_people
data_files:
- split: train
path: "data/three_principles_of_people_dev.csv"
- split: validation
path: "data/three_principles_of_people_val.csv"
- split: test
path: "data/three_principles_of_people_test.csv"
- config_name: taiwanese_hokkien
data_files:
- split: train
path: "data/taiwanese_hokkien_dev.csv"
- split: validation
path: "data/taiwanese_hokkien_val.csv"
- split: test
path: "data/taiwanese_hokkien_test.csv"
---
# TMMLU+ : Large scale traditional chinese massive multitask language understanding
<p align="center">
<img src="https://huggingface.co/datasets/ikala/tmmluplus/resolve/main/cover.png" alt="A close-up image of a neat paper note with a white background. The text 'TMMLU+' is written horizontally across the center of the note in bold, black. Join us to work in multimodal LLM : https://ikala.ai/recruit/" style="max-width: 400" width=400 />
</p>
We present TMMLU+, a traditional Chinese massive multitask language understanding dataset. TMMLU+ is a multiple-choice question-answering dataset featuring 66 subjects, ranging from elementary to professional level.
The TMMLU+ dataset is six times larger and contains more balanced subjects compared to its predecessor, [TMMLU](https://github.com/mtkresearch/MR-Models/tree/main/TC-Eval/data/TMMLU). We have included benchmark results in TMMLU+ from closed-source models and 20 open-weight Chinese large language models, with parameters ranging from 1.8B to 72B. The benchmark results show that models trained on Traditional Chinese still lag behind those trained on major Simplified Chinese corpora.
```python
from datasets import load_dataset
task_list = [
'engineering_math', 'dentistry', 'traditional_chinese_medicine_clinical_medicine', 'clinical_psychology', 'technical', 'culinary_skills', 'mechanical', 'logic_reasoning', 'real_estate',
'general_principles_of_law', 'finance_banking', 'anti_money_laundering', 'ttqav2', 'marketing_management', 'business_management', 'organic_chemistry', 'advance_chemistry',
'physics', 'secondary_physics', 'human_behavior', 'national_protection', 'jce_humanities', 'politic_science', 'agriculture', 'official_document_management',
'financial_analysis', 'pharmacy', 'educational_psychology', 'statistics_and_machine_learning', 'management_accounting', 'introduction_to_law', 'computer_science', 'veterinary_pathology',
'accounting', 'fire_science', 'optometry', 'insurance_studies', 'pharmacology', 'taxation', 'trust_practice', 'geography_of_taiwan', 'physical_education', 'auditing', 'administrative_law',
'education_(profession_level)', 'economics', 'veterinary_pharmacology', 'nautical_science', 'occupational_therapy_for_psychological_disorders',
'basic_medical_science', 'macroeconomics', 'trade', 'chinese_language_and_literature', 'tve_design', 'junior_science_exam', 'junior_math_exam', 'junior_chinese_exam',
'junior_social_studies', 'tve_mathematics', 'tve_chinese_language', 'tve_natural_sciences', 'junior_chemistry', 'music', 'education', 'three_principles_of_people',
'taiwanese_hokkien'
]
for task in task_list:
val = load_dataset('ikala/tmmluplus', task)['validation']
dev = load_dataset('ikala/tmmluplus', task)['train']
test = load_dataset('ikala/tmmluplus', task)['test']
```
Each split is a `Dataset` whose rows contain a question, four options (A-D), and the answer:
```python
print(test)
>> Dataset({
    features: ['question', 'A', 'B', 'C', 'D', 'answer'],
    num_rows: 11
})
```
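For 0-shot direct answering (the setting used for the benchmark below), each row can be rendered into a prompt along these lines (an illustrative template, not the exact one used by ievals):
```python
def format_prompt(row):
    return (
        f"問題:{row['question']}\n"
        f"A. {row['A']}\nB. {row['B']}\nC. {row['C']}\nD. {row['D']}\n"
        "答案:"
    )

prompt = format_prompt(test[0])
# Feed `prompt` to the model and compare its first emitted option
# letter (A/B/C/D) against test[0]["answer"] to score accuracy.
```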
Statistics across the four categories: STEM, Social Sciences, Humanities, and Other
| Category | Test | Dev | Validation |
|----------------------------------|-------|------|------------|
| STEM | 3458 | 70 | 385 |
| Social Sciences | 5958 | 90 | 665 |
| Humanities | 1763 | 35 | 197 |
| Other (Business, Health, Misc.) | 8939 | 135 | 995 |
| **Total** | 20118 | 330 | 2242 |
## Benchmark on direct prompting
| model | STEM | Social Science | Humanities | Other | Average |
|------------|------------|------------|------------|------------|------------|
| [Qwen/Qwen-72B](https://huggingface.co/Qwen/Qwen-72B) | 61.12 | 71.65 | 63.00 | 61.31 |64.27|
| gpt-4-0613 | 60.36 | 67.36 | 56.03 | 57.62 |60.34|
| [Qwen/Qwen-72B-Chat](https://huggingface.co/Qwen/Qwen-72B-Chat) | 55.15 | 66.20 | 55.65 | 57.19 |58.55|
| [Qwen/Qwen-14B](https://huggingface.co/Qwen/Qwen-14B) | 46.94 | 56.69 | 49.43 | 48.81 |50.47|
| Gemini-pro | 45.38 | 57.29 | 48.80 | 48.21 |49.92|
| [01-ai/Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat) | 40.24 | 56.77 | 53.99 | 47.58 |49.64|
| [Qwen/Qwen-14B-Chat](https://huggingface.co/Qwen/Qwen-14B-Chat) | 43.86 | 53.29 | 44.78 | 45.13 |46.77|
| [01-ai/Yi-6B-Chat](https://huggingface.co/01-ai/Yi-6B-Chat) | 39.62 | 50.24 | 44.44 | 44.26 |44.64|
| Claude-1.3 | 42.65 | 49.33 | 42.16 | 44.14 |44.57|
| gpt-3.5-turbo-0613 | 41.56 | 46.72 | 36.73 | 42.03 |41.76|
| [CausalLM/14B](https://huggingface.co/CausalLM/14B) | 39.83 | 44.50 | 39.61 | 41.97 |41.48|
| [Skywork/Skywork-13B-base](https://huggingface.co/Skywork/Skywork-13B-base) | 36.93 | 47.27 | 41.04 | 40.10 |41.33|
| [Qwen/Qwen-7B](https://huggingface.co/Qwen/Qwen-7B) | 37.53 | 45.48 | 38.09 | 38.96 |40.01|
| [Qwen/Qwen-7B-Chat](https://huggingface.co/Qwen/Qwen-7B-Chat) | 33.32 | 44.64 | 40.27 | 39.89 |39.53|
| [vivo-ai/BlueLM-7B-Base](https://huggingface.co/vivo-ai/BlueLM-7B-Base) | 33.94 | 41.52 | 37.38 | 38.74 |37.90|
| [baichuan-inc/Baichuan2-13B-Chat](https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat) | 29.64 | 43.73 | 37.36 | 39.88 |37.65|
| [Qwen/Qwen-1_8B](https://huggingface.co/Qwen/Qwen-1_8B) | 32.65 | 38.95 | 38.34 | 35.27 |36.30|
| Claude-2 | 39.65 | 39.09 | 28.59 | 37.47 |36.20|
| [THUDM/chatglm3-6b](https://huggingface.co/THUDM/chatglm3-6b) | 31.05 | 39.31 | 35.64 | 35.60 |35.40|
| [deepseek-ai/deepseek-llm-7b-chat](https://huggingface.co/deepseek-ai/deepseek-llm-7b-chat) | 29.82 | 42.29 | 34.24 | 34.31 |35.17|
| [CausalLM/7B](https://huggingface.co/CausalLM/7B) | 31.03 | 38.17 | 35.87 | 35.39 |35.11|
| [Azure99/blossom-v3_1-mistral-7b](https://huggingface.co/Azure99/blossom-v3_1-mistral-7b) | 32.80 | 36.91 | 32.36 | 34.53 |34.15|
| [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) | 24.69 | 39.18 | 33.60 | 31.99 |32.37|
| [Qwen/Qwen-1_8B-Chat](https://huggingface.co/Qwen/Qwen-1_8B-Chat) | 26.60 | 36.36 | 31.81 | 31.96 |31.68|
| [TigerResearch/tigerbot-13b-chat-v3](https://huggingface.co/TigerResearch/tigerbot-13b-chat-v3) | 24.73 | 29.63 | 25.72 | 27.22 |26.82|
| [hongyin/mistral-7b-80k](https://huggingface.co/hongyin/mistral-7b-80k) | 24.26 | 23.76 | 22.56 | 24.57 |23.79|
| [deepseek-ai/deepseek-llm-67b-chat](https://huggingface.co/deepseek-ai/deepseek-llm-67b-chat) | 19.10 | 26.06 | 21.51 | 21.77 |22.11|
| [yentinglin/Taiwan-LLM-13B-v2.0-chat](https://huggingface.co/yentinglin/Taiwan-LLM-13B-v2.0-chat) | 18.53 | 27.65 | 17.77 | 21.49 |21.36|
| [GeneZC/MiniChat-3B](https://huggingface.co/GeneZC/MiniChat-3B) | 17.66 | 23.35 | 22.71 | 20.34 |21.02|
| [LinkSoul/Chinese-Llama-2-7b](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b) | 16.55 | 18.39 | 12.97 | 16.13 |16.01|
| [yentinglin/Taiwan-LLM-7B-v2.1-chat](https://huggingface.co/yentinglin/Taiwan-LLM-7B-v2.1-chat) | 14.99 | 16.23 | 15.00 | 16.22 |15.61|
| Claude-instant-1 | 12.52 | 17.13 | 15.10 | 13.57 |14.58|
| [FlagAlpha/Atom-7B](https://huggingface.co/FlagAlpha/Atom-7B) | 5.60 | 13.57 | 7.71 | 11.84 |9.68|
Results via [ievals](https://github.com/iKala/ievals) (settings: 0-shot direct answering)
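In this 0-shot direct-answering setting, the model sees only the question and the four options, and its free-form reply is mapped back to a choice letter and scored. A minimal sketch of that scoring step (ievals' actual parsing rules may differ):
```python
import re

def parse_choice(completion):
    # Take the first standalone A/B/C/D in the model's reply.
    match = re.search(r'\b([ABCD])\b', completion)
    return match.group(1) if match else None

def accuracy(rows, completions):
    # rows: dataset rows with an 'answer' letter; completions: model replies.
    hits = sum(parse_choice(c) == row['answer'] for row, c in zip(rows, completions))
    return hits / len(rows)
```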
# Citation
```
@article{ikala2024improved,
title={An Improved Traditional Chinese Evaluation Suite for Foundation Model},
author={Tam, Zhi-Rui and Pai, Ya-Ting and Lee, Yen-Wei and Cheng, Sega and Shuai, Hong-Han},
journal={arXiv preprint arXiv:2403.01858},
year={2024}
}
```
|
ealtan/MoodBooster | ---
license: mit
---
|
kheopss/lettre_admin_f1.0_regenerated | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: text
dtype: string
- name: text2
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4462522
num_examples: 697
download_size: 1690790
dataset_size: 4462522
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vitaliy-sharandin/pollution-krakow-no2-co | ---
dataset_info:
features:
- name: NO2
dtype: float64
- name: CO
dtype: float64
- name: dt
dtype: timestamp[ns]
splits:
- name: train
num_bytes: 6816
num_examples: 284
download_size: 9084
dataset_size: 6816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pollution-krakow-no2-co"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
natnitaract/SciBench-TruthfulQA-RAG | ---
license: apache-2.0
task_categories:
- multiple-choice
--- |
Falah/portrait_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 22213518
num_examples: 100000
download_size: 2797158
dataset_size: 22213518
---
# Dataset Card for "portrait_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MohammedNasri/163762AASRnoDiacs | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 147266551704
num_examples: 153322
- name: test
num_bytes: 10027423544
num_examples: 10440
download_size: 23754042731
dataset_size: 157293975248
---
# Dataset Card for "163762AASRnoDiacs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rombodawg/code_instruct_alpaca_vicuna_wizardlm_56k_backup | ---
license: other
---
Backup of code_instruct_alpaca_vicuna_wizardlm used in rombodawg/MegaCodeTraining112k
Link to the combined dataset below:
https://huggingface.co/datasets/rombodawg/MegaCodeTraining112k |
open-llm-leaderboard/details_max-2022__test_mistral2 | ---
pretty_name: Evaluation run of max-2022/test_mistral2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [max-2022/test_mistral2](https://huggingface.co/max-2022/test_mistral2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_max-2022__test_mistral2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T22:17:01.815383](https://huggingface.co/datasets/open-llm-leaderboard/details_max-2022__test_mistral2/blob/main/results_2024-02-11T22-17-01.815383.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24710575900369156,\n\
\ \"acc_stderr\": 0.03055560450787355,\n \"acc_norm\": 0.24800687316365916,\n\
\ \"acc_norm_stderr\": 0.0313664960522948,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.490951583190258,\n\
\ \"mc2_stderr\": 0.016977888460336696\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23037542662116042,\n \"acc_stderr\": 0.01230492841874761,\n\
\ \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601338\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2575184226249751,\n\
\ \"acc_stderr\": 0.004363736410689625,\n \"acc_norm\": 0.25323640709022105,\n\
\ \"acc_norm_stderr\": 0.004339764434219064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.0261998088075619,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.0261998088075619\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.03068302084323101,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.03068302084323101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208837,\n \"\
acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208837\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994106,\n \"\
acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.16161616161616163,\n \"acc_stderr\": 0.026225919863629283,\n \"\
acc_norm\": 0.16161616161616163,\n \"acc_norm_stderr\": 0.026225919863629283\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371215,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958934,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402543,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402543\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.21487603305785125,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.029058588303748842,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.029058588303748842\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n\
\ \"acc_stderr\": 0.015517322365529627,\n \"acc_norm\": 0.2515964240102171,\n\
\ \"acc_norm_stderr\": 0.015517322365529627\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.02279711027807113,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.02279711027807113\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\
\ \"acc_stderr\": 0.014173044098303665,\n \"acc_norm\": 0.2346368715083799,\n\
\ \"acc_norm_stderr\": 0.014173044098303665\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02380518652488815,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02380518652488815\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.22508038585209003,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886324,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307847,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307847\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2536764705882353,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.2536764705882353,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904052,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904052\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.029475250236017176,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.029475250236017176\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.490951583190258,\n\
\ \"mc2_stderr\": 0.016977888460336696\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.48539857932123126,\n \"acc_stderr\": 0.014046492383275834\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/max-2022/test_mistral2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|arc:challenge|25_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|gsm8k|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hellaswag|10_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T22-17-01.815383.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- '**/details_harness|winogrande|5_2024-02-11T22-17-01.815383.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T22-17-01.815383.parquet'
- config_name: results
data_files:
- split: 2024_02_11T22_17_01.815383
path:
- results_2024-02-11T22-17-01.815383.parquet
- split: latest
path:
- results_2024-02-11T22-17-01.815383.parquet
---
# Dataset Card for Evaluation run of max-2022/test_mistral2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [max-2022/test_mistral2](https://huggingface.co/max-2022/test_mistral2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_max-2022__test_mistral2",
"harness_winogrande_5",
split="train")
```
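Each evaluated task has its own configuration, so it can help to enumerate them before loading; a small sketch using the `datasets` helper:
```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus "results" for the aggregated metrics.
configs = get_dataset_config_names("open-llm-leaderboard/details_max-2022__test_mistral2")
print(len(configs), configs[:5])
```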
## Latest results
These are the [latest results from run 2024-02-11T22:17:01.815383](https://huggingface.co/datasets/open-llm-leaderboard/details_max-2022__test_mistral2/blob/main/results_2024-02-11T22-17-01.815383.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24710575900369156,
"acc_stderr": 0.03055560450787355,
"acc_norm": 0.24800687316365916,
"acc_norm_stderr": 0.0313664960522948,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.490951583190258,
"mc2_stderr": 0.016977888460336696
},
"harness|arc:challenge|25": {
"acc": 0.23037542662116042,
"acc_stderr": 0.01230492841874761,
"acc_norm": 0.2790102389078498,
"acc_norm_stderr": 0.013106784883601338
},
"harness|hellaswag|10": {
"acc": 0.2575184226249751,
"acc_stderr": 0.004363736410689625,
"acc_norm": 0.25323640709022105,
"acc_norm_stderr": 0.004339764434219064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.0261998088075619,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.0261998088075619
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.03068302084323101,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.03068302084323101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994106,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16161616161616163,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.16161616161616163,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371215,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958934,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.018819182034850068,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.018819182034850068
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402543,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402543
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748842,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748842
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529627,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529627
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.02279711027807113,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.02279711027807113
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303665,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303665
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02380518652488815,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02380518652488815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886324,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307847,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307847
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443737,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2536764705882353,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.2536764705882353,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904052,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904052
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017176,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017176
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.490951583190258,
"mc2_stderr": 0.016977888460336696
},
"harness|winogrande|5": {
"acc": 0.48539857932123126,
"acc_stderr": 0.014046492383275834
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
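For quick offline inspection of a results file like the one above, here is a minimal sketch, assuming the JSON has been saved locally as `results.json` (a hypothetical filename), that ranks the MMLU ("hendrycksTest") subtasks by accuracy:
```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results.json") as f:
    results = json.load(f)

# Collect per-subtask accuracy for the MMLU ("hendrycksTest") tasks
# and print the five lowest-scoring ones.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1])[:5]:
    print(f"{task}: {acc:.3f}")
```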
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hemanth955/Gold-alpaca-med-small-final | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Input
dtype: string
- name: Output
dtype: int64
- name: Instruction
dtype: string
- name: Text
dtype: string
splits:
- name: train
num_bytes: 48641940
num_examples: 27360
download_size: 10550533
dataset_size: 48641940
---
# Dataset Card for "Gold-alpaca-med-small-final"
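The YAML header above fully specifies the schema (`Unnamed: 0`, `Input`, `Output`, `Instruction`, `Text`) and a single `train` split of 27,360 examples. A minimal loading sketch, assuming only that the `datasets` library is installed:
```python
from datasets import load_dataset

# Load the single "train" split (27,360 examples per the dataset_info above).
ds = load_dataset("hemanth955/Gold-alpaca-med-small-final", split="train")

# Inspect the schema and one record; column names come from the YAML header.
print(ds.features)
print(ds[0]["Instruction"])
```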
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k | ---
pretty_name: Evaluation run of CallComply/SOLAR-10.7B-Instruct-v1.0-128k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CallComply/SOLAR-10.7B-Instruct-v1.0-128k](https://huggingface.co/CallComply/SOLAR-10.7B-Instruct-v1.0-128k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T22:38:12.148949](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k/blob/main/results_2024-01-14T22-38-12.148949.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5736345987046274,\n\
\ \"acc_stderr\": 0.033417579618165875,\n \"acc_norm\": 0.5822139213719528,\n\
\ \"acc_norm_stderr\": 0.03421698352385503,\n \"mc1\": 0.48592411260709917,\n\
\ \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6542262778057006,\n\
\ \"mc2_stderr\": 0.015681013574816827\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892973\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6415056761601274,\n\
\ \"acc_stderr\": 0.004785781979354868,\n \"acc_norm\": 0.8434574785899224,\n\
\ \"acc_norm_stderr\": 0.003626262805442223\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.02989060968628664,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.02989060968628664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099522,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099522\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.04489539350270699,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.04489539350270699\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957543,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806585,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042338,\n\
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042338\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.018224078117299106,\n \"\
acc_norm\": 0.763302752293578,\n \"acc_norm_stderr\": 0.018224078117299106\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990948,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990948\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.015384352284543932,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.015384352284543932\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242836,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172544,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172544\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602653,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.027731258647011994,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.027731258647011994\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155885992,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155885992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777518,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777518\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4975124378109453,\n\
\ \"acc_stderr\": 0.03535490150137288,\n \"acc_norm\": 0.4975124378109453,\n\
\ \"acc_norm_stderr\": 0.03535490150137288\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48592411260709917,\n\
\ \"mc1_stderr\": 0.017496563717042793,\n \"mc2\": 0.6542262778057006,\n\
\ \"mc2_stderr\": 0.015681013574816827\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938256\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.0070864621279544985\n }\n}\n```"
repo_url: https://huggingface.co/CallComply/SOLAR-10.7B-Instruct-v1.0-128k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|arc:challenge|25_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|gsm8k|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hellaswag|10_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T22-38-12.148949.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- '**/details_harness|winogrande|5_2024-01-14T22-38-12.148949.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T22-38-12.148949.parquet'
- config_name: results
data_files:
- split: 2024_01_14T22_38_12.148949
path:
- results_2024-01-14T22-38-12.148949.parquet
- split: latest
path:
- results_2024-01-14T22-38-12.148949.parquet
---
# Dataset Card for Evaluation run of CallComply/SOLAR-10.7B-Instruct-v1.0-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/SOLAR-10.7B-Instruct-v1.0-128k](https://huggingface.co/CallComply/SOLAR-10.7B-Instruct-v1.0-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k",
"harness_winogrande_5",
split="train")
```
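The timestamped splits and the `results` configuration described above can be loaded the same way. A short sketch, assuming only the `datasets` library, that pulls the aggregated metrics from the most recent run:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; the "latest" split
# always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k",
    "results",
    split="latest",
)
```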
## Latest results
These are the [latest results from run 2024-01-14T22:38:12.148949](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__SOLAR-10.7B-Instruct-v1.0-128k/blob/main/results_2024-01-14T22-38-12.148949.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5736345987046274,
"acc_stderr": 0.033417579618165875,
"acc_norm": 0.5822139213719528,
"acc_norm_stderr": 0.03421698352385503,
"mc1": 0.48592411260709917,
"mc1_stderr": 0.017496563717042793,
"mc2": 0.6542262778057006,
"mc2_stderr": 0.015681013574816827
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892973
},
"harness|hellaswag|10": {
"acc": 0.6415056761601274,
"acc_stderr": 0.004785781979354868,
"acc_norm": 0.8434574785899224,
"acc_norm_stderr": 0.003626262805442223
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.02989060968628664,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.02989060968628664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099522,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099522
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.04489539350270699,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.04489539350270699
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957543,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042338,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042338
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.018224078117299106,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.018224078117299106
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990948,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990948
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543932,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543932
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242836,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172544,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602653,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.027731258647011994,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.027731258647011994
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.026915003011380154,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.026915003011380154
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155885992,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155885992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777518,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777518
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4975124378109453,
"acc_stderr": 0.03535490150137288,
"acc_norm": 0.4975124378109453,
"acc_norm_stderr": 0.03535490150137288
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48592411260709917,
"mc1_stderr": 0.017496563717042793,
"mc2": 0.6542262778057006,
"mc2_stderr": 0.015681013574816827
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938256
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.0070864621279544985
}
}
```
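Because the scores above are plain floats keyed by task name, aggregate views are easy to recompute. A minimal sketch, assuming the linked JSON file has been downloaded locally and contains exactly the mapping shown above (the actual file may nest it under another key):
```python
import json

# Assumption: the local file holds the task-name -> metrics mapping shown above.
with open("results_2024-01-14T22-38-12.148949.json") as f:
    results = json.load(f)

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
macro_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, macro-average acc = {macro_acc:.4f}")
```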
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/ikebukuro_akiha_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ikebukuro_akiha/池袋晶葉 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ikebukuro_akiha/池袋晶葉 (THE iDOLM@STER: Cinderella Girls), containing 190 images and their tags.
The core tags of this character are `brown_hair, glasses, brown_eyes, twintails, long_hair, semi-rimless_eyewear, bangs, pink-framed_eyewear, under-rim_eyewear, blunt_bangs, bow, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 190 | 180.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikebukuro_akiha_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 190 | 125.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikebukuro_akiha_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 443 | 258.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikebukuro_akiha_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 190 | 165.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikebukuro_akiha_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 443 | 327.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikebukuro_akiha_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ikebukuro_akiha_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
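The `IMG+TXT` packages in the table above ship images together with plain-text tag files instead of waifuc metadata. A minimal sketch of walking such a package, assuming each image is paired with a same-stem `.txt` file (adjust `filename` for the 800/1200/stage3 variants):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download an IMG+TXT package (here the 800px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/ikebukuro_akiha_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumption: every image has a same-stem .txt file holding its tags
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```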
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, ponytail, ribbed_sweater, solo, turtleneck_sweater, labcoat, looking_at_viewer, simple_background, smile, white_background, large_breasts, red_sweater, blush, hair_bow, long_sleeves, sidelocks, skirt |
| 1 | 18 |  |  |  |  |  | 1girl, solo, serafuku, skirt, labcoat, smile, blush, white_background |
| 2 | 5 |  |  |  |  |  | 1girl, blue_sailor_collar, blue_skirt, labcoat, long_sleeves, looking_at_viewer, pleated_skirt, red_bow, simple_background, solo, hair_bow, serafuku, shirt, white_background, closed_mouth, grin, sitting |
| 3 | 11 |  |  |  |  |  | 1girl, blue_skirt, blush, labcoat, long_sleeves, looking_at_viewer, open_clothes, pleated_skirt, solo, white_background, hair_ribbon, serafuku, simple_background, blue_sailor_collar, red_bow, red_ribbon, sidelocks, blue_shirt, smile, collarbone, v-shaped_eyebrows, closed_mouth, hand_up, red-framed_eyewear, signature, white_shirt |
| 4 | 7 |  |  |  |  |  | 1girl, solo, black_thighhighs, grin, belt, labcoat, rabbit_ears |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, solo, collarbone, pussy, red-framed_eyewear, small_breasts, completely_nude, depth_of_field, open_mouth, sidelocks, standing, sweat, :o, bar_censor, blurry_background, convenient_censoring, cowboy_shot, dutch_angle, groin, hair_bow, medium_breasts, onsen, rectangular_eyewear, red_bow, sitting, steam, v-shaped_eyebrows, very_long_hair |
| 6 | 7 |  |  |  |  |  | 1girl, solo, collarbone, double_bun, midriff, necklace, sidelocks, blush, bracelet, looking_at_viewer, short_sleeves, blue_shorts, clothes_around_waist, clothes_writing, navel, off_shoulder, short_shorts, breasts, crop_top, denim_shorts, grin, hair_bow, hair_ribbon, star_(symbol), white_shirt, x_hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ponytail | ribbed_sweater | solo | turtleneck_sweater | labcoat | looking_at_viewer | simple_background | smile | white_background | large_breasts | red_sweater | blush | hair_bow | long_sleeves | sidelocks | skirt | serafuku | blue_sailor_collar | blue_skirt | pleated_skirt | red_bow | shirt | closed_mouth | grin | sitting | open_clothes | hair_ribbon | red_ribbon | blue_shirt | collarbone | v-shaped_eyebrows | hand_up | red-framed_eyewear | signature | white_shirt | black_thighhighs | belt | rabbit_ears | navel | nipples | pussy | small_breasts | completely_nude | depth_of_field | open_mouth | standing | sweat | :o | bar_censor | blurry_background | convenient_censoring | cowboy_shot | dutch_angle | groin | medium_breasts | onsen | rectangular_eyewear | steam | very_long_hair | double_bun | midriff | necklace | bracelet | short_sleeves | blue_shorts | clothes_around_waist | clothes_writing | off_shoulder | short_shorts | breasts | crop_top | denim_shorts | star_(symbol) | x_hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-----------------|:-------|:---------------------|:----------|:--------------------|:--------------------|:--------|:-------------------|:----------------|:--------------|:--------|:-----------|:---------------|:------------|:--------|:-----------|:---------------------|:-------------|:----------------|:----------|:--------|:---------------|:-------|:----------|:---------------|:--------------|:-------------|:-------------|:-------------|:--------------------|:----------|:---------------------|:------------|:--------------|:-------------------|:-------|:--------------|:--------|:----------|:--------|:----------------|:------------------|:-----------------|:-------------|:-----------|:--------|:-----|:-------------|:--------------------|:-----------------------|:--------------|:--------------|:--------|:-----------------|:--------|:----------------------|:--------|:-----------------|:-------------|:----------|:-----------|:-----------|:----------------|:--------------|:-----------------------|:------------------|:---------------|:---------------|:----------|:-----------|:---------------|:----------------|:------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | | | X | | X | | | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | | X | X | X | | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | X | | X | X | X | X | X | | | X | | X | X | | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | | X | X | | X | | | | | | X | | | | X | | | | | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | X | | | X | | | | | | X | X | | X | | | | | | | | | X | | | X | | | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
one-sec-cv12/chunk_57 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 25320557952.0
num_examples: 263624
download_size: 23104462381
dataset_size: 25320557952.0
---
# Dataset Card for "chunk_57"
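The YAML header above declares a single `audio` column decoded at 16 kHz. A minimal loading sketch, assuming the default configuration (note the download is roughly 23 GiB):
```python
from datasets import load_dataset

ds = load_dataset("one-sec-cv12/chunk_57", split="train")
sample = ds[0]["audio"]  # decoded lazily on access by the Audio feature
print(sample["sampling_rate"], len(sample["array"]))
```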
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_fri_c3_1000_50 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 304920
num_examples: 726
- name: validation
num_bytes: 115080
num_examples: 274
download_size: 504483
dataset_size: 420000
---
# Dataset Card for "metatree_fri_c3_1000_50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lamini/lamini_docs | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1846734.3
num_examples: 1260
- name: test
num_bytes: 205192.7
num_examples: 140
download_size: 698607
dataset_size: 2051927.0
---
# Dataset Card for "lamini_docs"
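Besides the raw `question`/`answer` strings, the schema above includes pre-tokenized `input_ids`, `attention_mask`, and `labels` columns. A minimal sketch of inspecting both views, assuming the default configuration:
```python
from datasets import load_dataset

ds = load_dataset("lamini/lamini_docs", split="train")
row = ds[0]
print(row["question"])
print(row["answer"])
# pre-tokenized companions to the raw text
print(len(row["input_ids"]), len(row["attention_mask"]), len(row["labels"]))
```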
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sgjwong/ltedi23-models | ---
license: cc-by-4.0
---
|
Tuana/presidents | ---
dataset_info:
features:
- name: id
dtype: string
- name: content
dtype: string
- name: content_type
dtype: string
- name: meta
struct:
- name: url
dtype: string
- name: _split_id
dtype: int64
- name: id_hash_keys
sequence: string
- name: score
dtype: 'null'
- name: embedding
dtype: 'null'
splits:
- name: train
num_bytes: 9366886
num_examples: 5529
download_size: 4997888
dataset_size: 9366886
---
# Dataset Card for "presidents"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hongboyang/LCSTS_instruction1 | ---
dataset_info:
features:
- name: INPUT
dtype: string
- name: TARGET
dtype: string
splits:
- name: train
num_bytes: 1128722053
num_examples: 2400591
download_size: 693529602
dataset_size: 1128722053
---
# Dataset Card for "LCSTS_instruction1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-Tristan__zero-shot-classification-large-test-Tristan__z-d81307-16956302 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Tristan/zero-shot-classification-large-test
eval_info:
task: text_zero_shot_classification
model: Tristan/opt-66b-copy
metrics: []
dataset_name: Tristan/zero-shot-classification-large-test
dataset_config: Tristan--zero-shot-classification-large-test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: Tristan/opt-66b-copy
* Dataset: Tristan/zero-shot-classification-large-test
* Config: Tristan--zero-shot-classification-large-test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
furry-br/krystal | ---
license: openrail
---
|
liyongsea/empty_function_jupyter | ---
dataset_info:
features:
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_event_created_at
dtype: timestamp[us]
- name: gha_updated_at
dtype: timestamp[us]
- name: gha_language
dtype: string
- name: language
dtype: string
- name: is_generated
dtype: bool
- name: is_vendor
dtype: bool
- name: conversion_extension
dtype: string
- name: size
dtype: int64
- name: script
dtype: string
- name: script_size
dtype: int64
splits:
- name: train
num_bytes: 654648.6506
num_examples: 28
download_size: 292451
dataset_size: 654648.6506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "empty_function_jupyter"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lawine_sousounofrieren | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lawine/ラヴィーネ (Sousou no Frieren)
This is the dataset of Lawine/ラヴィーネ (Sousou no Frieren), containing 192 images and their tags.
The core tags of this character are `long_hair, braid, blue_eyes, blunt_bangs, grey_hair, french_braid, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 192 | 124.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lawine_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 192 | 124.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lawine_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 313 | 191.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lawine_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lawine_sousounofrieren',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blue_capelet, blue_dress, long_sleeves, solo, holding_staff, closed_mouth, upper_body, blurry_background, frilled_capelet, outdoors |
| 1 | 14 |  |  |  |  |  | 1girl, holding_staff, long_sleeves, solo, blue_capelet, blue_dress, frilled_capelet, standing, outdoors, closed_mouth, white_thighhighs, cross-laced_footwear, cloudy_sky, very_long_hair, full_body, thigh_boots, tree |
| 2 | 9 |  |  |  |  |  | 1girl, closed_mouth, expressionless, blue_capelet, solo, upper_body, blue_dress, outdoors, tree, looking_at_viewer, forest, frills |
| 3 | 5 |  |  |  |  |  | 1girl, blue_capelet, blue_dress, frilled_capelet, long_sleeves, outdoors, solo, looking_at_viewer, open_mouth, closed_mouth, cloudy_sky |
| 4 | 7 |  |  |  |  |  | 1girl, blue_capelet, closed_mouth, solo, blue_dress, frilled_dress, lace-up_boots, long_sleeves, standing, white_footwear, white_thighhighs, full_body, thigh_boots, frilled_capelet, outdoors, blurry, forest, looking_at_viewer |
| 5 | 8 |  |  |  |  |  | 1girl, blue_dress, closed_mouth, long_sleeves, solo, low-tied_long_hair, very_long_hair, expressionless, frilled_capelet, frilled_dress, from_side, profile |
| 6 | 6 |  |  |  |  |  | blue_capelet, blue_dress, frilled_dress, long_sleeves, sitting_on_person, very_long_hair, 2girls, checkered_floor, frilled_capelet, solo_focus, white_thighhighs, green_shorts, thigh_boots, closed_mouth, low-tied_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_capelet | blue_dress | long_sleeves | solo | holding_staff | closed_mouth | upper_body | blurry_background | frilled_capelet | outdoors | standing | white_thighhighs | cross-laced_footwear | cloudy_sky | very_long_hair | full_body | thigh_boots | tree | expressionless | looking_at_viewer | forest | frills | open_mouth | frilled_dress | lace-up_boots | white_footwear | blurry | low-tied_long_hair | from_side | profile | sitting_on_person | 2girls | checkered_floor | solo_focus | green_shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------------|:---------------|:-------|:----------------|:---------------|:-------------|:--------------------|:------------------|:-----------|:-----------|:-------------------|:-----------------------|:-------------|:-----------------|:------------|:--------------|:-------|:-----------------|:--------------------|:---------|:---------|:-------------|:----------------|:----------------|:-----------------|:---------|:---------------------|:------------|:----------|:--------------------|:---------|:------------------|:-------------|:---------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | | X | X | | | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | X | X | | | | X | | | | | | X | | | X | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | X | | X | | | X | X | X | X | | | | X | X | | | X | X | | | X | X | X | X | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | X | X | | X | | | X | | | | | | X | | | | X | | | | | X | | | | X | X | X | | | | | |
| 6 | 6 |  |  |  |  |  | | X | X | X | | | X | | | X | | | X | | | X | | X | | | | | | | X | | | | X | | | X | X | X | X | X |
|
mar-yam1497/HotPotQA_Mistral_dataset_Top3k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7691946
num_examples: 3000
download_size: 3547124
dataset_size: 7691946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxie/bbbp | ---
dataset_info:
features:
- name: index
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train_0
num_bytes: 112140
num_examples: 1631
- name: val_0
num_bytes: 18772
num_examples: 204
- name: test_0
num_bytes: 15004
num_examples: 204
- name: train_1
num_bytes: 112140
num_examples: 1631
- name: val_1
num_bytes: 18772
num_examples: 204
- name: test_1
num_bytes: 15004
num_examples: 204
- name: train_2
num_bytes: 112140
num_examples: 1631
- name: val_2
num_bytes: 18772
num_examples: 204
- name: test_2
num_bytes: 15004
num_examples: 204
download_size: 218838
dataset_size: 437748
---
# Dataset Card for "bbbp"
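Note that the splits follow a fold-indexed naming scheme (`train_0` … `test_2`) rather than the usual `train`/`validation`/`test`. A minimal sketch of loading one fold, assuming the default configuration:
```python
from datasets import load_dataset

fold = 0  # one of 0, 1, 2
train = load_dataset("jxie/bbbp", split=f"train_{fold}")
val = load_dataset("jxie/bbbp", split=f"val_{fold}")
test = load_dataset("jxie/bbbp", split=f"test_{fold}")
print(len(train), len(val), len(test))  # 1631 / 204 / 204 per the YAML header
```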
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ | ---
pretty_name: Evaluation run of TheBloke/Genz-70b-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Genz-70b-GPTQ](https://huggingface.co/TheBloke/Genz-70b-GPTQ) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T00:30:34.342002](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ/blob/main/results_2023-08-31T00%3A30%3A34.342002.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7017249416277331,\n\
\ \"acc_stderr\": 0.030832772804323012,\n \"acc_norm\": 0.70569345061239,\n\
\ \"acc_norm_stderr\": 0.03080075128019408,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6228267270427654,\n\
\ \"mc2_stderr\": 0.014836432877772263\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.689205337582155,\n\
\ \"acc_stderr\": 0.004618730353217047,\n \"acc_norm\": 0.8764190400318662,\n\
\ \"acc_norm_stderr\": 0.0032843028764223\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.02557625706125384,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.02557625706125384\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632159,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632159\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822523,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.023119362758232294,\n\
\ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232294\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611769,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611769\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640262,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640262\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941642,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941642\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999876,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999876\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252562,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252562\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5754189944134078,\n\
\ \"acc_stderr\": 0.01653117099327888,\n \"acc_norm\": 0.5754189944134078,\n\
\ \"acc_norm_stderr\": 0.01653117099327888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398205,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149897,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149897\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5534550195567145,\n\
\ \"acc_stderr\": 0.012697046024399654,\n \"acc_norm\": 0.5534550195567145,\n\
\ \"acc_norm_stderr\": 0.012697046024399654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103135,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103135\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.017201662169789772,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.017201662169789772\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155754,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155754\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6228267270427654,\n\
\ \"mc2_stderr\": 0.014836432877772263\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Genz-70b-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|arc:challenge|25_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hellaswag|10_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T00:30:34.342002.parquet'
- config_name: results
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- results_2023-08-31T00:30:34.342002.parquet
- split: latest
path:
- results_2023-08-31T00:30:34.342002.parquet
---
# Dataset Card for Evaluation run of TheBloke/Genz-70b-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Genz-70b-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Genz-70b-GPTQ](https://huggingface.co/TheBloke/Genz-70b-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
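For the aggregated scores rather than per-sample details, here is a minimal sketch using the `results` config and the `latest` split defined in the YAML above (config and split names are taken from this card):
```python
from datasets import load_dataset

# "results" stores the aggregated scores of the run; the "latest" split
# always resolves to the most recent results file (here 2023-08-31T00:30:34).
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ",
    "results",
    split="latest",
)
```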
## Latest results
These are the [latest results from run 2023-08-31T00:30:34.342002](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ/blob/main/results_2023-08-31T00%3A30%3A34.342002.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7017249416277331,
"acc_stderr": 0.030832772804323012,
"acc_norm": 0.70569345061239,
"acc_norm_stderr": 0.03080075128019408,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6228267270427654,
"mc2_stderr": 0.014836432877772263
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.689205337582155,
"acc_stderr": 0.004618730353217047,
"acc_norm": 0.8764190400318662,
"acc_norm_stderr": 0.0032843028764223
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.02557625706125384,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.02557625706125384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632159,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.023119362758232294,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.023119362758232294
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611769,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611769
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073312,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640262,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640262
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941642,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941642
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999876,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999876
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252562,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252562
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5754189944134078,
"acc_stderr": 0.01653117099327888,
"acc_norm": 0.5754189944134078,
"acc_norm_stderr": 0.01653117099327888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398205,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149897,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149897
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5534550195567145,
"acc_stderr": 0.012697046024399654,
"acc_norm": 0.5534550195567145,
"acc_norm_stderr": 0.012697046024399654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103135,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103135
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.017201662169789772,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.017201662169789772
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.025172984350155754,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.025172984350155754
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6228267270427654,
"mc2_stderr": 0.014836432877772263
}
}
```
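As a quick post-processing sketch, assuming the JSON above has been parsed into a Python dict named `results` (for example with `json.loads`), the per-task MMLU accuracies can be ranked like this (requires Python 3.9+ for `str.removeprefix`):
```python
# `results` is assumed to hold the dict printed above.
mmlu = {
    task.removeprefix("harness|hendrycksTest-").removesuffix("|5"): scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Print the five strongest MMLU subjects of this run.
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```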
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
joey234/mmlu-college_biology-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 9147.1875
num_examples: 27
download_size: 7857
dataset_size: 9147.1875
---
# Dataset Card for "mmlu-college_biology-original-neg"
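The `answer` feature above is a `class_label`, so rows store integer indices; here is a minimal sketch (using the repo id from the title) that decodes them back to the A-D letters:
```python
from datasets import load_dataset

# "test" is the only split defined above (27 examples).
ds = load_dataset("joey234/mmlu-college_biology-original-neg", split="test")

ex = ds[0]
# ClassLabel features map stored ints back to their names via int2str.
print(ex["question"], ds.features["answer"].int2str(ex["answer"]))
```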
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_59_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12541572
num_examples: 23244
download_size: 6485582
dataset_size: 12541572
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_59_std"
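Given the `conversation_id` and `message_id` columns above, here is a small sketch that regroups rows into ordered conversations (assuming `message_id` reflects the in-conversation order, which this card does not state explicitly):
```python
from collections import defaultdict

from datasets import load_dataset

ds = load_dataset("AdapterOcean/med_alpaca_standardized_cluster_59_std", split="train")

# Group messages by conversation, then sort each group by message_id.
conversations = defaultdict(list)
for row in ds:
    conversations[row["conversation_id"]].append(row)
for conv in conversations.values():
    conv.sort(key=lambda r: r["message_id"])
```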
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jeiku__NarrativeNexus_7B | ---
pretty_name: Evaluation run of jeiku/NarrativeNexus_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeiku/NarrativeNexus_7B](https://huggingface.co/jeiku/NarrativeNexus_7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeiku__NarrativeNexus_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T01:30:29.349287](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__NarrativeNexus_7B/blob/main/results_2024-02-16T01-30-29.349287.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6331502373053775,\n\
\ \"acc_stderr\": 0.032649477056743835,\n \"acc_norm\": 0.6360612367088411,\n\
\ \"acc_norm_stderr\": 0.03330403787596569,\n \"mc1\": 0.46878824969400246,\n\
\ \"mc1_stderr\": 0.017469364874577537,\n \"mc2\": 0.6394506791157332,\n\
\ \"mc2_stderr\": 0.015272071804569947\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.01412459788184446,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6773551085441147,\n\
\ \"acc_stderr\": 0.004665327309399188,\n \"acc_norm\": 0.8573989245170285,\n\
\ \"acc_norm_stderr\": 0.003489509493001621\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46878824969400246,\n\
\ \"mc1_stderr\": 0.017469364874577537,\n \"mc2\": 0.6394506791157332,\n\
\ \"mc2_stderr\": 0.015272071804569947\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5178165276724791,\n \
\ \"acc_stderr\": 0.013763738379867933\n }\n}\n```"
repo_url: https://huggingface.co/jeiku/NarrativeNexus_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|arc:challenge|25_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|gsm8k|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hellaswag|10_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T01-30-29.349287.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- '**/details_harness|winogrande|5_2024-02-16T01-30-29.349287.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T01-30-29.349287.parquet'
- config_name: results
data_files:
- split: 2024_02_16T01_30_29.349287
path:
- results_2024-02-16T01-30-29.349287.parquet
- split: latest
path:
- results_2024-02-16T01-30-29.349287.parquet
---
# Dataset Card for Evaluation run of jeiku/NarrativeNexus_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeiku/NarrativeNexus_7B](https://huggingface.co/jeiku/NarrativeNexus_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeiku__NarrativeNexus_7B",
"harness_winogrande_5",
	split="latest")
```
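The aggregated metrics live in the `results` configuration declared above; a minimal sketch for pulling them (using the `latest` split from the configs):
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_jeiku__NarrativeNexus_7B",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the most recent evaluation run
```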
## Latest results
These are the [latest results from run 2024-02-16T01:30:29.349287](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__NarrativeNexus_7B/blob/main/results_2024-02-16T01-30-29.349287.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6331502373053775,
"acc_stderr": 0.032649477056743835,
"acc_norm": 0.6360612367088411,
"acc_norm_stderr": 0.03330403787596569,
"mc1": 0.46878824969400246,
"mc1_stderr": 0.017469364874577537,
"mc2": 0.6394506791157332,
"mc2_stderr": 0.015272071804569947
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.01412459788184446,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.6773551085441147,
"acc_stderr": 0.004665327309399188,
"acc_norm": 0.8573989245170285,
"acc_norm_stderr": 0.003489509493001621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630457,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46878824969400246,
"mc1_stderr": 0.017469364874577537,
"mc2": 0.6394506791157332,
"mc2_stderr": 0.015272071804569947
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.5178165276724791,
"acc_stderr": 0.013763738379867933
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/spitfire_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of spitfire/Spitfire/喷火 (Girls' Frontline)
This is the dataset of spitfire/Spitfire/喷火 (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `long_hair, hat, green_eyes, top_hat, grey_hair, breasts, bangs, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 20.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 11.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 22.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 18.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 30.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
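The `IMG+TXT` packages are plain zip archives; below is a minimal sketch for reading one locally, assuming each image is paired with a same-stem `.txt` tag file (this layout is an assumption, not documented above):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and unpack the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/spitfire_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-stem .txt tag file (assumed layout)
for fname in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(fname)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                print(fname, '->', f.read().strip())
```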
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/spitfire_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, dress, belt, handgun, necktie, bare_shoulders, boots, brown_hair, holding_gun, official_alternate_costume, pantyhose, small_breasts, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | dress | belt | handgun | necktie | bare_shoulders | boots | brown_hair | holding_gun | official_alternate_costume | pantyhose | small_breasts | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------|:-------|:----------|:----------|:-----------------|:--------|:-------------|:--------------|:-----------------------------|:------------|:----------------|:-------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/moriyama_shiemi | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Moriyama Shiemi (Blue Exorcist / 青の祓魔師)
This is the dataset of Moriyama Shiemi (Blue Exorcist / 青の祓魔師), containing 258 images and their tags.
The core tags of this character are `blonde_hair, short_hair, green_eyes, hair_ornament, hair_flower, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 258 | 198.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moriyama_shiemi/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 258 | 160.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moriyama_shiemi/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 434 | 266.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moriyama_shiemi/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 258 | 190.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moriyama_shiemi/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 434 | 309.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/moriyama_shiemi/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/moriyama_shiemi',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, school_uniform, skirt, solo, smile, flower, open_mouth, blush, bow, white_thighhighs, zettai_ryouiki |
| 1 | 5 |  |  |  |  |  | 1girl, blush, bow, open_mouth, smile, solo, school_uniform, hair_ribbon, ahoge, aqua_eyes, necktie |
| 2 | 18 |  |  |  |  |  | 1girl, flower, kimono, smile, solo, open_mouth, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | skirt | solo | smile | flower | open_mouth | blush | bow | white_thighhighs | zettai_ryouiki | hair_ribbon | ahoge | aqua_eyes | necktie | kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-------|:--------|:---------|:-------------|:--------|:------|:-------------------|:-----------------|:--------------|:--------|:------------|:----------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | X | | X | X | X | | | X | X | X | X | |
| 2 | 18 |  |  |  |  |  | X | | | X | X | X | X | X | | | | | | | | X |
|
irds/neumarco_fa_dev_judged | ---
pretty_name: '`neumarco/fa/dev/judged`'
viewer: false
source_datasets: ['irds/neumarco_fa', 'irds/neumarco_fa_dev']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/fa/dev/judged`
The `neumarco/fa/dev/judged` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/fa/dev/judged).
# Data
This dataset provides:
- `queries` (i.e., topics); count=55,578
- For `docs`, use [`irds/neumarco_fa`](https://huggingface.co/datasets/irds/neumarco_fa)
- For `qrels`, use [`irds/neumarco_fa_dev`](https://huggingface.co/datasets/irds/neumarco_fa_dev)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_fa_dev_judged', 'queries')
for record in queries:
    print(record)  # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
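The companion datasets follow the same access pattern; a sketch for iterating their records (field names beyond `query_id` and `doc_id` are assumptions based on the usual MS MARCO/TREC layouts):
```python
from datasets import load_dataset

docs = load_dataset('irds/neumarco_fa', 'docs')
for record in docs:
    print(record)  # {'doc_id': ..., 'text': ...}
    break  # inspect a single record

qrels = load_dataset('irds/neumarco_fa_dev', 'qrels')
for record in qrels:
    print(record)  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., ...}
    break
```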
|
jp1924/JeollaSpeech | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: standard_form
dtype: string
- name: dialect_form
dtype: string
- name: start
dtype: float32
- name: end
dtype: float32
- name: note
dtype: string
- name: eojeolList
list:
- name: id
dtype: int8
- name: eojeol
dtype: string
- name: standard
dtype: string
- name: isDialect
dtype: bool
- name: speaker
struct:
- name: id
dtype: string
- name: name
dtype: string
- name: age
dtype: string
- name: occupation
dtype: string
- name: sex
dtype: string
- name: birthplace
dtype: string
- name: principal_residence
dtype: string
- name: current_residence
dtype: string
- name: education
dtype: string
- name: metadata
struct:
- name: title
dtype: string
- name: creator
dtype: string
- name: distributor
dtype: string
- name: year
dtype: string
- name: category
dtype: string
- name: annotation_level
list: string
- name: sampling
dtype: string
- name: author
dtype: string
- name: publisher
dtype: string
- name: date
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 520213137698.024
num_examples: 1988867
- name: validation
num_bytes: 61448267920.464
num_examples: 275137
download_size: 557887316021
dataset_size: 581661405618.488
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
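No card body is provided; below is a minimal loading sketch based only on the schema above. Streaming is used because, per the split sizes, the corpus is roughly 580 GB (this is a sketch, not official usage documentation):
```python
from datasets import load_dataset

# stream to avoid materializing the full corpus locally
ds = load_dataset("jp1924/JeollaSpeech", split="train", streaming=True)

sample = next(iter(ds))
print(sample["standard_form"])  # standard-Korean form of the utterance
print(sample["dialect_form"])   # Jeolla-dialect form
audio = sample["audio"]         # dict with 'array' and 'sampling_rate' == 16000
```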
|
yzhuang/autotree_pmlb_10000_Hill_Valley_without_noise_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 179483399
dataset_size: 472880000
---
# Dataset Card for "autotree_pmlb_10000_Hill_Valley_without_noise_sgosdt_l256_dim10_d3_sd0"
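A minimal loading sketch based only on the `dataset_info` schema above; the per-field comments are assumptions inferred from the field names, not documented semantics:
```python
from datasets import load_dataset

ds = load_dataset(
    "yzhuang/autotree_pmlb_10000_Hill_Valley_without_noise_sgosdt_l256_dim10_d3_sd0",
    split="train",
)
row = ds[0]
print(len(row["input_x"]), len(row["input_x"][0]))  # outer/inner sequence lengths (inner presumably dim=10)
print(row["split_dimension"][:5])                   # presumably per-node split feature indices
```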
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yoroizuka_mizore_soundeuphonium | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yoroizuka Mizore/鎧塚みぞれ/铠冢霙/のぞみぞ (Sound! Euphonium)
This is the dataset of Yoroizuka Mizore/鎧塚みぞれ/铠冢霙/のぞみぞ (Sound! Euphonium), containing 228 images and their tags.
The core tags of this character are `long_hair, blue_hair, red_eyes, black_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 228 | 159.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yoroizuka_mizore_soundeuphonium/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 228      | 159.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yoroizuka_mizore_soundeuphonium/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 400 | 267.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yoroizuka_mizore_soundeuphonium/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yoroizuka_mizore_soundeuphonium',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
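As a follow-up, the per-image tags printed above can be aggregated locally, e.g. to see which tags dominate before inspecting the clusters below (a sketch; `list(item.meta['tags'])` is assumed to yield tag names, which holds whether the tags are stored as a list of names or as a mapping from name to score):
```python
from collections import Counter
from waifuc.source import LocalSource
# tally tag frequencies across the extracted dataset directory
tag_counts = Counter()
for item in LocalSource('dataset_dir'):
    tag_counts.update(list(item.meta['tags']))
# show the most common tags
for tag, count in tag_counts.most_common(20):
    print(tag, count)
```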
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blue_sailor_collar, blurry_background, blush, kitauji_high_school_uniform, outdoors, serafuku, solo, white_shirt, blue_neckerchief, blue_skirt, closed_mouth, pleated_skirt, school_bag, short_sleeves, standing, tree, black_bag, looking_to_the_side |
| 1 | 15 |  |  |  |  |  | blush, kitauji_high_school_uniform, serafuku, 1girl, blue_sailor_collar, white_shirt, solo, indoors, parted_lips, open_mouth, blurry_background, closed_mouth, window |
| 2 | 12 |  |  |  |  |  | blue_sailor_collar, blush, kitauji_high_school_uniform, serafuku, white_shirt, 2girls, neckerchief, closed_mouth, looking_at_viewer, solo_focus, blurry |
| 3 | 11 |  |  |  |  |  | blue_neckerchief, blue_sailor_collar, kitauji_high_school_uniform, serafuku, short_sleeves, white_shirt, 2girls, blue_skirt, blush, indoors, pleated_skirt, brown_hair, chair, closed_mouth, sitting, solo_focus, classroom, instrument |
| 4 | 5 |  |  |  |  |  | 1girl, blue_neckerchief, blue_sailor_collar, blush, chair, classroom, from_side, indoors, kitauji_high_school_uniform, serafuku, solo, white_shirt, window, closed_mouth, short_sleeves, sitting, blurry, flute, holding_instrument |
| 5 | 30 |  |  |  |  |  | kitauji_high_school_uniform, serafuku, 1girl, solo, white_sailor_collar, holding_instrument, brown_shirt, long_sleeves, playing_instrument, closed_mouth, blurry_background, blue_neckerchief |
| 6 | 5 |  |  |  |  |  | 1girl, brown_shirt, brown_skirt, holding_instrument, kitauji_high_school_uniform, long_sleeves, pleated_skirt, serafuku, standing, white_sailor_collar, solo, blue_neckerchief, blush, from_side |
| 7 | 10 |  |  |  |  |  | blush, brown_shirt, closed_mouth, kitauji_high_school_uniform, serafuku, solo_focus, white_sailor_collar, blue_neckerchief, 2girls, long_sleeves, brown_skirt, blurry_background, pleated_skirt, sitting, socks |
| 8 | 12 |  |  |  |  |  | blush, school_uniform, 1girl, collared_shirt, green_jacket, red_bowtie, solo, white_shirt, closed_mouth, blazer, long_sleeves, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_sailor_collar | blurry_background | blush | kitauji_high_school_uniform | outdoors | serafuku | solo | white_shirt | blue_neckerchief | blue_skirt | closed_mouth | pleated_skirt | school_bag | short_sleeves | standing | tree | black_bag | looking_to_the_side | indoors | parted_lips | open_mouth | window | 2girls | neckerchief | looking_at_viewer | solo_focus | blurry | brown_hair | chair | sitting | classroom | instrument | from_side | flute | holding_instrument | white_sailor_collar | brown_shirt | long_sleeves | playing_instrument | brown_skirt | socks | school_uniform | collared_shirt | green_jacket | red_bowtie | blazer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:--------------------|:--------|:------------------------------|:-----------|:-----------|:-------|:--------------|:-------------------|:-------------|:---------------|:----------------|:-------------|:----------------|:-----------|:-------|:------------|:----------------------|:----------|:--------------|:-------------|:---------|:---------|:--------------|:--------------------|:-------------|:---------|:-------------|:--------|:----------|:------------|:-------------|:------------|:--------|:---------------------|:----------------------|:--------------|:---------------|:---------------------|:--------------|:--------|:-----------------|:-----------------|:---------------|:-------------|:---------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | | X | | X | X | | X | | X | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | | X | | X | X | | X | | X | X | X | X | X | | X | | | | | X | | | | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | X | | X | X | X | X | | X | | | X | | | | | X | | | X | | | | | X | | X | X | X | | X | X | X | | | | | | | | | | | |
| 5 | 30 |  |  |  |  |  | X | | X | | X | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | X | | X | X | X | X | | X | | | | | | |
| 7 | 10 |  |  |  |  |  | | | X | X | X | | X | | | X | | X | X | | | | | | | | | | | X | | | X | | | | X | | | | | | X | X | X | | X | X | | | | | |
| 8 | 12 |  |  |  |  |  | X | | | X | | | | X | X | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | X | X | X | X | X |
|
chrisjay/crowd-speech-africa | ---
extra_gated_prompt: "You agree to not attempt to determine the identity of individuals in this dataset"
extra_gated_fields:
Name: text
Affiliation: text
Email: text
I agree to not attempt to determine the identity of speakers in this dataset: checkbox
---
|
ai-aerospace/ams_data_train_mistral_v0.1_100 | ---
license: apache-2.0
base_model: TheBloke/Llama-2-7B-Chat-GGUF
---
Question-and-answer pairs for the first 100 of the Aerospace Mechanisms Symposia 5000-word-chunk entries. The full file of entries is here: https://github.com/dsmueller3760/aerospace_chatbot/blob/llm_training/data/AMS/ams_data_answers.jsonl
See this repository for details: https://github.com/dsmueller3760/aerospace_chatbot/tree/main
Prompts were generated using TheBloke/Llama-2-7B-Chat-GGUF.
The format is representative of Mistral's instruct LLMs:
* https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
* Example dataset: https://huggingface.co/datasets/centroIA/MistralInstructScenarios
`<s>[INST] {prompt} [/INST]`
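For illustration, wrapping a raw prompt in this template could look like the following (a minimal sketch; `build_instruct_prompt` is a hypothetical helper, not part of this dataset):
```python
def build_instruct_prompt(prompt: str) -> str:
    # Wrap a raw prompt in the Mistral-style instruct template shown above.
    return f"<s>[INST] {prompt} [/INST]"
print(build_instruct_prompt("Summarize the symposium entry on bearing lubricants."))
# -> <s>[INST] Summarize the symposium entry on bearing lubricants. [/INST]
```
|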
NicholasSynovic/Free-AutoTrain-VEAA | ---
license: agpl-3.0
task_categories:
- text-classification
language:
- en
pretty_name: Victorian Era Authorship Attribution Data Set (For Free AutoTrain Account)
size_categories:
- 1K<n<10K
source_datasets:
- NicholasSynovic/Victorian-Era-Authorship-Attribution
---
# Free AutoTrain VEAA
> Victorian Era Authorship Attribution Data Set (For Free AutoTrain Account)
## About
See the [original HF-hosted dataset](https://huggingface.co/datasets/NicholasSynovic/Victorian-Era-Authorship-Attribution) for more information.
The code to generate this dataset came from this [GitHub Repo](https://github.com/NicholasSynovic/nlp-victorianAuthor). |
gmongaras/BERT_Base_Cased_512_Dataset_Mapped | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 52875464012.02522
num_examples: 136226984
download_size: 17583618282
dataset_size: 52875464012.02522
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Dataset using the bert-cased tokenizer, with sentences cut off at 512 tokens (single sentences, not sentence pairs); all sentence pairs were extracted.
Original datasets:
* https://huggingface.co/datasets/bookcorpus
* https://huggingface.co/datasets/wikipedia (variant: 20220301.en)
Mapped from: https://huggingface.co/datasets/gmongaras/BERT_Base_Cased_512_Dataset
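A sketch of the tokenization described above (assuming the standard `bert-base-cased` tokenizer from 🤗 `transformers`; the exact mapping code lives in the source dataset linked above):
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
# Truncate each text to 512 tokens, yielding the three columns in the schema:
# input_ids, token_type_ids, attention_mask.
encoded = tokenizer("An example sentence from the corpus.", truncation=True, max_length=512)
print(encoded["input_ids"][:8])
print(encoded["token_type_ids"][:8])
print(encoded["attention_mask"][:8])
```
|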
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/245282ee | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1331
dataset_size: 182
---
# Dataset Card for "245282ee"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/linde_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Linde (Fire Emblem)
This is the dataset of Linde (Fire Emblem), containing 168 images and their tags.
The core tags of this character are `brown_hair, long_hair, ponytail, brown_eyes, breasts, very_long_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 168 | 187.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linde_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 168      | 116.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linde_fireemblem/resolve/main/dataset-800.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 378 | 232.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linde_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 168      | 170.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linde_fireemblem/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 378 | 310.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/linde_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/linde_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, solo, circlet, looking_at_viewer, cleavage, navel, smile, blush, hair_ornament, pink_bikini, open_mouth, simple_background |
| 1 | 5 |  |  |  |  |  | 1girl, circlet, earrings, solo, blush, open_mouth, smile, looking_at_viewer, nipples, one_eye_closed |
| 2 | 9 |  |  |  |  |  | 1girl, circlet, solo, smile, looking_at_viewer, bare_shoulders, blush, armlet, cleavage, open_mouth, pink_dress, medium_breasts |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, belt, full_body, hair_ornament, knee_boots, medium_breasts, side_slit, solo, white_dress, white_footwear, absurdly_long_hair, bangs, blush, collarbone, jewelry, long_dress, open_mouth, simple_background, sleeveless_dress, thighs, white_background, circlet, holding_book, leg_up, :d, armpits, hand_up, looking_at_viewer, open_book, pelvic_curtain |
| 4 | 25 |  |  |  |  |  | 1girl, hetero, nipples, solo_focus, penis, blush, 1boy, sex, open_mouth, vaginal, circlet, mosaic_censoring, cum_in_pussy, spread_legs, nude, cum_on_body, facial, navel, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | circlet | looking_at_viewer | cleavage | navel | smile | blush | hair_ornament | pink_bikini | open_mouth | simple_background | earrings | nipples | one_eye_closed | bare_shoulders | armlet | pink_dress | medium_breasts | belt | full_body | knee_boots | side_slit | white_dress | white_footwear | absurdly_long_hair | bangs | collarbone | jewelry | long_dress | sleeveless_dress | thighs | white_background | holding_book | leg_up | :d | armpits | hand_up | open_book | pelvic_curtain | hetero | solo_focus | penis | 1boy | sex | vaginal | mosaic_censoring | cum_in_pussy | spread_legs | nude | cum_on_body | facial | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------------------|:-----------|:--------|:--------|:--------|:----------------|:--------------|:-------------|:--------------------|:-----------|:----------|:-----------------|:-----------------|:---------|:-------------|:-----------------|:-------|:------------|:-------------|:------------|:--------------|:-----------------|:---------------------|:--------|:-------------|:----------|:-------------|:-------------------|:---------|:-------------------|:---------------|:---------|:-----|:----------|:----------|:------------|:-----------------|:---------|:-------------|:--------|:-------|:------|:----------|:-------------------|:---------------|:--------------|:-------|:--------------|:---------|:--------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | | | | X | X | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 4 | 25 |  |  |  |  |  | X | | X | | | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
zirui3/cuad-instructions | ---
license: cc-by-4.0
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: text
sequence: string
- name: answer_start
sequence: int64
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 2933858226
num_examples: 44900
- name: test
num_bytes: 397434014
num_examples: 8364
download_size: 6827533
dataset_size: 3331292240
---
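A minimal loading sketch based on the schema above:
```python
from datasets import load_dataset
ds = load_dataset("zirui3/cuad-instructions", split="train")
example = ds[0]
print(example["instruction"])
print(example["question"])
print(example["answers"]["text"])          # list of answer span strings
print(example["answers"]["answer_start"])  # character offsets into `context`
```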
|
ChristophSchuhmann/OpenClip-B32-KNN-Captioner | ---
license: apache-2.0
---
|
KoziCreative/Testing | ---
license: afl-3.0
---
|
colkassad/map_navigation_v1 | ---
license: mit
---
|
open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1 | ---
pretty_name: Evaluation run of logicker/SkkuDS-DPO-72B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [logicker/SkkuDS-DPO-72B-v1](https://huggingface.co/logicker/SkkuDS-DPO-72B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T10:55:52.095277](https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1/blob/main/results_2024-02-16T10-55-52.095277.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7681185312998495,\n\
\ \"acc_stderr\": 0.02797672385731024,\n \"acc_norm\": 0.7728008468755523,\n\
\ \"acc_norm_stderr\": 0.02849748439769033,\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.595432675425976,\n\
\ \"mc2_stderr\": 0.014511387340720846\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131172,\n\
\ \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6671977693686517,\n\
\ \"acc_stderr\": 0.004702533775930293,\n \"acc_norm\": 0.8599880501892053,\n\
\ \"acc_norm_stderr\": 0.0034629026011361893\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846934,\n\
\ \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846934\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n\
\ \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n\
\ \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8042553191489362,\n \"acc_stderr\": 0.025937853139977148,\n\
\ \"acc_norm\": 0.8042553191489362,\n \"acc_norm_stderr\": 0.025937853139977148\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7116402116402116,\n \"acc_stderr\": 0.02333065405453588,\n \"\
acc_norm\": 0.7116402116402116,\n \"acc_norm_stderr\": 0.02333065405453588\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.5952380952380952,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"\
acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280459,\n \"\
acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280459\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\"\
: 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \
\ \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.030485538042484616,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.030485538042484616\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\
\ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281732,\n \"\
acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"\
acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329883,\n \"\
acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329883\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640273,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640273\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073892,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073892\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9144316730523627,\n\
\ \"acc_stderr\": 0.010002965568647285,\n \"acc_norm\": 0.9144316730523627,\n\
\ \"acc_norm_stderr\": 0.010002965568647285\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442262,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442262\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6424581005586593,\n\
\ \"acc_stderr\": 0.01602939447489489,\n \"acc_norm\": 0.6424581005586593,\n\
\ \"acc_norm_stderr\": 0.01602939447489489\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043728,\n\
\ \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043728\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n\
\ \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n\
\ \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.01924252622654454,\n\
\ \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.01924252622654454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6140808344198174,\n\
\ \"acc_stderr\": 0.012433398911476134,\n \"acc_norm\": 0.6140808344198174,\n\
\ \"acc_norm_stderr\": 0.012433398911476134\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.023345163616544838,\n\
\ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.023345163616544838\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8088235294117647,\n \"acc_stderr\": 0.015908290136278067,\n \
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.015908290136278067\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.595432675425976,\n\
\ \"mc2_stderr\": 0.014511387340720846\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330996\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \
\ \"acc_stderr\": 0.013059111935831497\n }\n}\n```"
repo_url: https://huggingface.co/logicker/SkkuDS-DPO-72B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|arc:challenge|25_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|gsm8k|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hellaswag|10_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T10-55-52.095277.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- '**/details_harness|winogrande|5_2024-02-16T10-55-52.095277.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T10-55-52.095277.parquet'
- config_name: results
data_files:
- split: 2024_02_16T10_55_52.095277
path:
- results_2024-02-16T10-55-52.095277.parquet
- split: latest
path:
- results_2024-02-16T10-55-52.095277.parquet
---
# Dataset Card for Evaluation run of logicker/SkkuDS-DPO-72B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [logicker/SkkuDS-DPO-72B-v1](https://huggingface.co/logicker/SkkuDS-DPO-72B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1",
	"harness_winogrande_5",
	split="latest")  # or a timestamped split such as "2024_02_16T10_55_52.095277"
```
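As a minimal sketch (assuming the config/split layout shown in the YAML above), the aggregated scores can be loaded the same way through the "results" configuration:
```python
from datasets import load_dataset

# Load the aggregated scores for this run; "latest" aliases the most recent timestamped split.
results = load_dataset("open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1",
	"results",
	split="latest")
```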
## Latest results
These are the [latest results from run 2024-02-16T10:55:52.095277](https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1/blob/main/results_2024-02-16T10-55-52.095277.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; the results for each eval can be found in its timestamped split and in the "latest" split):
```python
{
"all": {
"acc": 0.7681185312998495,
"acc_stderr": 0.02797672385731024,
"acc_norm": 0.7728008468755523,
"acc_norm_stderr": 0.02849748439769033,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.595432675425976,
"mc2_stderr": 0.014511387340720846
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131172,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6671977693686517,
"acc_stderr": 0.004702533775930293,
"acc_norm": 0.8599880501892053,
"acc_norm_stderr": 0.0034629026011361893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846934,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846934
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8042553191489362,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.8042553191489362,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7116402116402116,
"acc_stderr": 0.02333065405453588,
"acc_norm": 0.7116402116402116,
"acc_norm_stderr": 0.02333065405453588
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280459,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280459
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723333,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.01967163241310029,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.01967163241310029
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.030485538042484616,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030485538042484616
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.926605504587156,
"acc_stderr": 0.011180976446357573,
"acc_norm": 0.926605504587156,
"acc_norm_stderr": 0.011180976446357573
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6990740740740741,
"acc_stderr": 0.03128039084329883,
"acc_norm": 0.6990740740740741,
"acc_norm_stderr": 0.03128039084329883
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640273,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640273
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073892,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073892
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9144316730523627,
"acc_stderr": 0.010002965568647285,
"acc_norm": 0.9144316730523627,
"acc_norm_stderr": 0.010002965568647285
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442262,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6424581005586593,
"acc_stderr": 0.01602939447489489,
"acc_norm": 0.6424581005586593,
"acc_norm_stderr": 0.01602939447489489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043728,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043728
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.02135534302826405,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.02135534302826405
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.01924252622654454,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.01924252622654454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.624113475177305,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.624113475177305,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6140808344198174,
"acc_stderr": 0.012433398911476134,
"acc_norm": 0.6140808344198174,
"acc_norm_stderr": 0.012433398911476134
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.023345163616544838,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.023345163616544838
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.015908290136278067,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.015908290136278067
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824667,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824667
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.595432675425976,
"mc2_stderr": 0.014511387340720846
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480330996
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831497
}
}
```
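As a rough sanity check on the aggregate above, one can average the per-subtask MMLU accuracies from this JSON. This is only an unweighted illustration under stated assumptions: the local file name is hypothetical, and the raw linked file may nest these scores under a "results" key.
```python
import json

# Hypothetical local copy of the linked results file; adjust the path as needed.
with open("results_2024-02-16T10-55-52.095277.json") as f:
    data = json.load(f)

# The per-task scores may live under a "results" key in the raw file.
scores = data.get("results", data)

# Collect the accuracy of every MMLU (hendrycksTest) subtask and print the unweighted mean.
mmlu = {k: v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")}
print(f"{len(mmlu)} MMLU subtasks, unweighted mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```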
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]