| datasetId | card |
|---|---|
LuangMV97/EmpathetiCounseling_Previo | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 13883804.45384577
num_examples: 47998
- name: test
num_bytes: 5771364.976021614
num_examples: 12002
download_size: 13237318
dataset_size: 19655169.429867383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
cjvt/sentinews | ---
annotations_creators:
- crowdsourced
language:
- sl
language_creators:
- found
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: SentiNews
size_categories: []
source_datasets:
- original
tags:
- slovenian sentiment
- news articles
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for SentiNews
## Dataset Description
- **Homepage:** https://github.com/19Joey85/Sentiment-annotated-news-corpus-and-sentiment-lexicon-in-Slovene
- **Paper:** Bučar, J., Žnidaršič, M. & Povh, J. Annotated news corpora and a lexicon for sentiment analysis in Slovene. Lang Resources & Evaluation 52, 895–919 (2018). https://doi.org/10.1007/s10579-018-9413-3
### Dataset Summary
SentiNews is a Slovenian sentiment classification dataset consisting of news articles manually annotated for sentiment by two to six annotators.
It is annotated at three granularities:
- document-level (config `document_level`, 10 427 documents),
- paragraph-level (config `paragraph_level`, 89 999 paragraphs), and
- sentence-level (config `sentence_level`, 168 899 sentences).
### Supported Tasks and Leaderboards
Sentiment classification, three classes (negative, neutral, positive).
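The sentiment labels are stored as strings, so a common first preprocessing step is mapping them to integer class IDs. A minimal sketch (the label ordering below is an assumption for illustration, not something fixed by the dataset itself):

```python
# Map the three string sentiment labels to integer class IDs.
# The ordering is an assumption, not prescribed by the dataset.
LABEL2ID = {"negative": 0, "neutral": 1, "positive": 2}
ID2LABEL = {i: label for label, i in LABEL2ID.items()}

def encode(example: dict) -> dict:
    """Add an integer `label` field alongside the string `sentiment`."""
    return {**example, "label": LABEL2ID[example["sentiment"]]}
```

A function like this can be applied per-example (e.g. via `datasets.Dataset.map`) before training a classifier.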
### Languages
Slovenian.
## Dataset Structure
### Data Instances
A sample instance from the sentence-level config:
```
{
'nid': 2,
'content': 'Vilo Prešeren je na dražbi ministrstva za obrambo kupilo nepremičninsko podjetje Condor Real s sedežem v Lescah.',
'sentiment': 'neutral',
'pid': 1,
'sid': 1
}
```
### Data Fields
The data fields are the same across all three configs; the configs differ only in which ID fields are present.
- `nid`: a uint16 containing the unique ID of the news article (document).
- `content`: a string containing the text of the instance (the full article, paragraph, or sentence, depending on the config).
- `sentiment`: the sentiment of the instance (negative, neutral, or positive).
- `pid`: a uint8 containing the consecutive number of the paragraph inside the current news article, **not unique** (present in the configs `paragraph_level` and `sentence_level`).
- `sid`: a uint8 containing the consecutive number of the sentence inside the current paragraph, **not unique** (present in the config `sentence_level`).
## Additional Information
### Dataset Curators
Jože Bučar, Martin Žnidaršič, Janez Povh.
### Licensing Information
CC BY-SA 4.0
### Citation Information
```
@article{buvcar2018annotated,
title={Annotated news corpora and a lexicon for sentiment analysis in Slovene},
author={Bu{\v{c}}ar, Jo{\v{z}}e and {\v{Z}}nidar{\v{s}}i{\v{c}}, Martin and Povh, Janez},
journal={Language Resources and Evaluation},
volume={52},
number={3},
pages={895--919},
year={2018},
publisher={Springer}
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset.
|
TheFinAI/flare-fomc | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 384180
num_examples: 496
download_size: 140144
dataset_size: 384180
---
# Dataset Card for "flare-fomc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
r0ll/kussia | ---
license: openrail
language:
- ru
---
RVC v2 voice model of https://www.twitch.tv/kussia88, trained for 350 epochs. |
open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B | ---
pretty_name: Evaluation run of smelborp/MixtralOrochi8x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [smelborp/MixtralOrochi8x7B](https://huggingface.co/smelborp/MixtralOrochi8x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T02:24:04.286101](https://huggingface.co/datasets/open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B/blob/main/results_2023-12-30T02-24-04.286101.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6933650457336848,\n\
\ \"acc_stderr\": 0.030427906094644037,\n \"acc_norm\": 0.7040806942153237,\n\
\ \"acc_norm_stderr\": 0.03106416581410797,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6399397085586839,\n\
\ \"mc2_stderr\": 0.015220747814252549\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441379,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6814379605656243,\n\
\ \"acc_stderr\": 0.00464966527389064,\n \"acc_norm\": 0.8609838677554272,\n\
\ \"acc_norm_stderr\": 0.0034525630964691227\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.02512576648482785,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.02512576648482785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565656,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565656\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.046774730044912,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.046774730044912\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5211640211640212,\n \"acc_stderr\": 0.025728230952130723,\n \"\
acc_norm\": 0.5211640211640212,\n \"acc_norm_stderr\": 0.025728230952130723\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423298,\n \"\
acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423298\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n \"\
acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.025859164122051453,\n\
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.025859164122051453\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588957,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588957\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846315,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.02838039114709471,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.02838039114709471\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562586,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562586\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999874,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999874\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.464804469273743,\n\
\ \"acc_stderr\": 0.016681020931076655,\n \"acc_norm\": 0.464804469273743,\n\
\ \"acc_norm_stderr\": 0.016681020931076655\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.0231527224394023,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.0231527224394023\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815875,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815875\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5260756192959583,\n\
\ \"acc_stderr\": 0.012752858346533143,\n \"acc_norm\": 0.5260756192959583,\n\
\ \"acc_norm_stderr\": 0.012752858346533143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.025187786660227255,\n\
\ \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.025187786660227255\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7369281045751634,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.7369281045751634,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072878,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072878\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6399397085586839,\n\
\ \"mc2_stderr\": 0.015220747814252549\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.011268519971577682\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1728582259287339,\n \
\ \"acc_stderr\": 0.010415432246200583\n }\n}\n```"
repo_url: https://huggingface.co/smelborp/MixtralOrochi8x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-24-04.286101.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- '**/details_harness|winogrande|5_2023-12-30T02-24-04.286101.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T02-24-04.286101.parquet'
- config_name: results
data_files:
- split: 2023_12_30T02_24_04.286101
path:
- results_2023-12-30T02-24-04.286101.parquet
- split: latest
path:
- results_2023-12-30T02-24-04.286101.parquet
---
# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [smelborp/MixtralOrochi8x7B](https://huggingface.co/smelborp/MixtralOrochi8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-30T02:24:04.286101](https://huggingface.co/datasets/open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B/blob/main/results_2023-12-30T02-24-04.286101.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6933650457336848,
"acc_stderr": 0.030427906094644037,
"acc_norm": 0.7040806942153237,
"acc_norm_stderr": 0.03106416581410797,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6399397085586839,
"mc2_stderr": 0.015220747814252549
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441379,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725225
},
"harness|hellaswag|10": {
"acc": 0.6814379605656243,
"acc_stderr": 0.00464966527389064,
"acc_norm": 0.8609838677554272,
"acc_norm_stderr": 0.0034525630964691227
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.02512576648482785,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.02512576648482785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565656,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565656
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.046774730044912,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.046774730044912
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5211640211640212,
"acc_stderr": 0.025728230952130723,
"acc_norm": 0.5211640211640212,
"acc_norm_stderr": 0.025728230952130723
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.025859164122051453,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.025859164122051453
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588957,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588957
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846315,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.02838039114709471,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.02838039114709471
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562586,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562586
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999874,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.464804469273743,
"acc_stderr": 0.016681020931076655,
"acc_norm": 0.464804469273743,
"acc_norm_stderr": 0.016681020931076655
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.0231527224394023,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.0231527224394023
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815875,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5260756192959583,
"acc_stderr": 0.012752858346533143,
"acc_norm": 0.5260756192959583,
"acc_norm_stderr": 0.012752858346533143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.025187786660227255,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.025187786660227255
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7369281045751634,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.7369281045751634,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072878,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072878
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6399397085586839,
"mc2_stderr": 0.015220747814252549
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.011268519971577682
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.010415432246200583
}
}
```
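The per-task `acc` values above can be aggregated programmatically. The sketch below shows one way to average the MMLU (`hendrycksTest`) subtask accuracies; the dictionary is only a small excerpt copied from the JSON above, not the full results file.

```python
import statistics

# Excerpt of the results JSON above (values copied from the card);
# the full file has one "harness|..." entry per evaluated task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7697368421052632},
    "harness|winogrande|5": {"acc": 0.7987371744277821},
}

# Average accuracy over only the MMLU (hendrycksTest) subtasks,
# ignoring other benchmarks such as winogrande.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest")
]
mean_acc = statistics.mean(mmlu_accs)
print(f"mean MMLU acc over excerpt: {mean_acc:.4f}")  # 0.6023 for this excerpt
```

On the full `results_*.json` the same filter reproduces the leaderboard's aggregated MMLU score.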
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Athiwat/brain2 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 5590698.0
num_examples: 5
download_size: 5591635
dataset_size: 5590698.0
---
# Dataset Card for "brain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidgaofc/d_shadow_inout | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 927049
num_examples: 3280
download_size: 371018
dataset_size: 927049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/shiki_eiki_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shiki_eiki/四季映姫・ヤマザナドゥ/四季映姫/시키에이키야마자나두 (Touhou)
This is the dataset of shiki_eiki/四季映姫・ヤマザナドゥ/四季映姫/시키에이키야마자나두 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, hat, short_hair, blue_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 601.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 366.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1145 | 750.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 543.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1145 | 1018.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiki_eiki_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
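The IMG+TXT packages above are plain archives in which each image sits next to a same-stem `.txt` file holding its comma-separated tags. As a minimal stdlib-only sketch (no waifuc required), an extracted package can be read like this; the image-extension set and the `dataset_dir` name are assumptions, not part of the package spec:

```python
import os

def load_pairs(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with its same-stem .txt tags."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        # assumed image extensions; adjust if your package differs
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        tags = []
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                # tag files are comma-separated tag lists
                tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Each returned pair is `(image_path, tag_list)`, ready to feed into a training pipeline.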
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, blue_vest, long_sleeves, solo, white_shirt, frilled_hat, looking_at_viewer, simple_background, bangs, epaulettes, holding, rod_of_remorse, white_background, blue_headwear, black_skirt, blush, bow, open_mouth, upper_body, closed_mouth |
| 1 | 10 |  |  |  |  |  | 1girl, black_footwear, black_skirt, blue_vest, full_body, long_sleeves, ribbon-trimmed_skirt, rod_of_remorse, solo, white_shirt, asymmetrical_hair, bangs, blue_headwear, frilled_hat, holding, white_socks, closed_mouth, looking_at_viewer, red_bow, epaulettes, white_ribbon, footwear_bow, red_ribbon, standing, white_background, white_bow, simple_background, buttons, green_eyes, mary_janes |
| 2 | 5 |  |  |  |  |  | 1girl, bangs, black_skirt, blue_vest, cowboy_shot, juliet_sleeves, looking_at_viewer, red_ribbon, ribbon-trimmed_skirt, rod_of_remorse, solo, wide_sleeves, blush, closed_mouth, epaulettes, holding, white_shirt, frilled_hat, frilled_skirt, hair_between_eyes, red_bow, smile, white_ribbon, black_background, blue_headwear, green_eyes, hat_ribbon, spider_lily, standing |
| 3 | 5 |  |  |  |  |  | 1girl, hat_ribbon, shirt, solo, vest, looking_at_viewer, rod_of_remorse, skirt, juliet_sleeves, spider_lily, open_mouth, petals, wide_sleeves |
| 4 | 7 |  |  |  |  |  | 1girl, black_thighhighs, rod_of_remorse, skirt, solo, wide_sleeves, hat_ribbon, long_sleeves, zettai_ryouiki, vest, green_eyes |
| 5 | 9 |  |  |  |  |  | 1girl, solo, rod_of_remorse, upper_body, blush, looking_at_viewer |
| 6 | 5 |  |  |  |  |  | 1girl, adapted_costume, black_thighhighs, detached_sleeves, skirt, solo, alternate_costume, bare_shoulders, blush, bow, looking_at_viewer, magical_girl, rod_of_remorse, smile, zettai_ryouiki, asymmetrical_hair, hat_ribbon, open_mouth, boots, frills |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, mosaic_censoring, open_mouth, penis, sex, solo_focus, vaginal, bangs, blue_vest, cum_in_pussy, frilled_hat, nipples, on_back, white_shirt, blue_headwear, feet_out_of_frame, long_sleeves, looking_at_viewer, missionary, alternate_breast_size, black_skirt, bow, breast_grab, clothing_aside, cum_on_breasts, grabbing, hair_between_eyes, huge_breasts, navel, panties, pov, spread_legs |
| 8 | 11 |  |  |  |  |  | 1girl, hetero, open_mouth, penis, sex, solo_focus, vaginal, nipples, 1boy, blush, cum_in_pussy, cowgirl_position, girl_on_top, navel, nude, bangs, flat_chest, frilled_hat, mosaic_censoring, sweat, bar_censor, hair_between_eyes, heart, large_breasts, looking_at_viewer, tears, thighhighs |
| 9 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, pussy, solo, standing, bangs, frilled_hat, open_mouth, simple_background, small_breasts, white_background, ass_visible_through_thighs, censored, completely_nude, asymmetrical_hair, collarbone, cowboy_shot, green_eyes, hair_between_eyes, red_ribbon, thigh_gap, white_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_vest | long_sleeves | solo | white_shirt | frilled_hat | looking_at_viewer | simple_background | bangs | epaulettes | holding | rod_of_remorse | white_background | blue_headwear | black_skirt | blush | bow | open_mouth | upper_body | closed_mouth | black_footwear | full_body | ribbon-trimmed_skirt | asymmetrical_hair | white_socks | red_bow | white_ribbon | footwear_bow | red_ribbon | standing | white_bow | buttons | green_eyes | mary_janes | cowboy_shot | juliet_sleeves | wide_sleeves | frilled_skirt | hair_between_eyes | smile | black_background | hat_ribbon | spider_lily | shirt | vest | skirt | petals | black_thighhighs | zettai_ryouiki | adapted_costume | detached_sleeves | alternate_costume | bare_shoulders | magical_girl | boots | frills | 1boy | hetero | mosaic_censoring | penis | sex | solo_focus | vaginal | cum_in_pussy | nipples | on_back | feet_out_of_frame | missionary | alternate_breast_size | breast_grab | clothing_aside | cum_on_breasts | grabbing | huge_breasts | navel | panties | pov | spread_legs | cowgirl_position | girl_on_top | nude | flat_chest | sweat | bar_censor | heart | large_breasts | tears | thighhighs | pussy | small_breasts | ass_visible_through_thighs | censored | completely_nude | collarbone | thigh_gap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:---------------|:-------|:--------------|:--------------|:--------------------|:--------------------|:--------|:-------------|:----------|:-----------------|:-------------------|:----------------|:--------------|:--------|:------|:-------------|:-------------|:---------------|:-----------------|:------------|:-----------------------|:--------------------|:--------------|:----------|:---------------|:---------------|:-------------|:-----------|:------------|:----------|:-------------|:-------------|:--------------|:-----------------|:---------------|:----------------|:--------------------|:--------|:-------------------|:-------------|:--------------|:--------|:-------|:--------|:---------|:-------------------|:-----------------|:------------------|:-------------------|:--------------------|:-----------------|:---------------|:--------|:---------|:-------|:---------|:-------------------|:--------|:------|:-------------|:----------|:---------------|:----------|:----------|:--------------------|:-------------|:------------------------|:--------------|:-----------------|:-----------------|:-----------|:---------------|:--------|:----------|:------|:--------------|:-------------------|:--------------|:-------|:-------------|:--------|:-------------|:--------|:----------------|:--------|:-------------|:--------|:----------------|:-----------------------------|:-----------|:------------------|:-------------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | X | X | | X | X | X | X | | X | X | X | | | | X | | | X | | | X | X | | X | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | X | | | X | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | X | | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | | X | X | X | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 8 | 11 |  |  |  |  |  | X | | | | | X | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | | X | X | X | X | | | | X | | | X | | X | | | | | | X | | | X | | X | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/9c69e716 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1342
dataset_size: 180
---
# Dataset Card for "9c69e716"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/cards20230512 | ---
dataset_info:
features:
- name: url
dtype: string
- name: card
dtype: string
splits:
- name: train
num_bytes: 264313043
num_examples: 202911
download_size: 78441708
dataset_size: 264313043
---
# Dataset Card for "cards20230512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chiyuanhsiao/ML2021_HungyiLee_Corpus | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 1086428751.563
num_examples: 31181
download_size: 1086479549
dataset_size: 1086428751.563
---
# Dataset Card for "debug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WarpWingHF/QTRAGPT | ---
license: mit
---
|
sudopop/korean_food | ---
license: unknown
---
|
LexiconShiftInnovations/SinhalaSubtitlesDatasetClean | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 51288853
num_examples: 1149
download_size: 20145527
dataset_size: 51288853
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ylacombe/bella_ciao | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: vocals
dtype: audio
- name: others
dtype: audio
splits:
- name: train
num_bytes: 158140620.0
num_examples: 30
download_size: 156971661
dataset_size: 158140620.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO | ---
pretty_name: Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T12:02:04.707768](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO/blob/main/results_2024-02-11T12-02-04.707768.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083454936984162,\n\
\ \"acc_stderr\": 0.033140017189034275,\n \"acc_norm\": 0.6127945476017843,\n\
\ \"acc_norm_stderr\": 0.0338104933555728,\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5791139392635098,\n\
\ \"mc2_stderr\": 0.015266138543062658\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558903,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938163\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6436964748058156,\n\
\ \"acc_stderr\": 0.004779276329704048,\n \"acc_norm\": 0.8383788090021908,\n\
\ \"acc_norm_stderr\": 0.0036735065123709547\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.0250107491161376,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.0250107491161376\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646826,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646826\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694827,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694827\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"\
acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n\
\ \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n\
\ \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n\
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.036959801280988226,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.036959801280988226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374984,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.045416094465039504,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.045416094465039504\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.02308663508684141,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.02308663508684141\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379772,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.029812630701569743,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.029812630701569743\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n\
\ \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5791139392635098,\n\
\ \"mc2_stderr\": 0.015266138543062658\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836676\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4177407126611069,\n \
\ \"acc_stderr\": 0.013584820638504832\n }\n}\n```"
repo_url: https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|arc:challenge|25_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|gsm8k|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hellaswag|10_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T12-02-04.707768.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- '**/details_harness|winogrande|5_2024-02-11T12-02-04.707768.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T12-02-04.707768.parquet'
- config_name: results
data_files:
- split: 2024_02_11T12_02_04.707768
path:
- results_2024-02-11T12-02-04.707768.parquet
- split: latest
path:
- results_2024-02-11T12-02-04.707768.parquet
---
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-SFT-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO",
"harness_winogrande_5",
	split="latest")
```
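As the `configs` section above shows, each timestamped split name is derived from the run timestamp by replacing the characters that are not allowed in split names (`-` and `:`) with underscores. A minimal sketch of that mapping (the helper name is hypothetical, not part of the leaderboard tooling):

```python
def timestamp_to_split_name(ts: str) -> str:
    # Replace '-' and ':' with '_' so the ISO timestamp becomes a valid
    # split name, e.g. "2024-02-11T12:02:04.707768"
    # -> "2024_02_11T12_02_04.707768" (as seen in the configs above).
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-02-11T12:02:04.707768"))
# 2024_02_11T12_02_04.707768
```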
## Latest results
These are the [latest results from run 2024-02-11T12:02:04.707768](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-SFT-DPO/blob/main/results_2024-02-11T12-02-04.707768.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6083454936984162,
"acc_stderr": 0.033140017189034275,
"acc_norm": 0.6127945476017843,
"acc_norm_stderr": 0.0338104933555728,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5791139392635098,
"mc2_stderr": 0.015266138543062658
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558903,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.6436964748058156,
"acc_stderr": 0.004779276329704048,
"acc_norm": 0.8383788090021908,
"acc_norm_stderr": 0.0036735065123709547
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.0250107491161376,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.0250107491161376
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646826,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646826
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694827,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694827
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.036959801280988226,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.036959801280988226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374984,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.045416094465039504,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.045416094465039504
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.02308663508684141,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.02308663508684141
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379772,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.029812630701569743,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.029812630701569743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5791139392635098,
"mc2_stderr": 0.015266138543062658
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836676
},
"harness|gsm8k|5": {
"acc": 0.4177407126611069,
"acc_stderr": 0.013584820638504832
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bcui19/UltraMix | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 108959181
num_examples: 41457
download_size: 54577284
dataset_size: 108959181
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "UltraMix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vic0428/imdb-card-pred-scientific | ---
dataset_info:
features:
- name: text
dtype: string
- name: prompt
dtype: string
- name: true_cardinality
dtype: int64
splits:
- name: train
num_bytes: 39344995.2
num_examples: 80000
- name: test
num_bytes: 9836248.8
num_examples: 20000
download_size: 8634654
dataset_size: 49181244.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "imdb-card-pred-scientific"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_adept__persimmon-8b-base | ---
pretty_name: Evaluation run of adept/persimmon-8b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adept/persimmon-8b-base](https://huggingface.co/adept/persimmon-8b-base) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adept__persimmon-8b-base\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T16:30:00.730198](https://huggingface.co/datasets/open-llm-leaderboard/details_adept__persimmon-8b-base/blob/main/results_2023-10-11T16-30-00.730198.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4373382174928584,\n\
\ \"acc_stderr\": 0.03537473296886481,\n \"acc_norm\": 0.440779620602171,\n\
\ \"acc_norm_stderr\": 0.03536781150443019,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.378505315070287,\n\
\ \"mc2_stderr\": 0.013586954257578736\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41552901023890787,\n \"acc_stderr\": 0.014401366641216384,\n\
\ \"acc_norm\": 0.4274744027303754,\n \"acc_norm_stderr\": 0.014456862944650652\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5203146783509262,\n\
\ \"acc_stderr\": 0.004985661282998582,\n \"acc_norm\": 0.7114120693089027,\n\
\ \"acc_norm_stderr\": 0.004521798577922143\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296559,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296559\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.030533338430467512,\n\
\ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.030533338430467512\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325642,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4838709677419355,\n \"acc_stderr\": 0.028429203176724555,\n \"\
acc_norm\": 0.4838709677419355,\n \"acc_norm_stderr\": 0.028429203176724555\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"\
acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5050505050505051,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.5050505050505051,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5181347150259067,\n \"acc_stderr\": 0.036060650018329185,\n\
\ \"acc_norm\": 0.5181347150259067,\n \"acc_norm_stderr\": 0.036060650018329185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.39487179487179486,\n \"acc_stderr\": 0.02478431694215638,\n\
\ \"acc_norm\": 0.39487179487179486,\n \"acc_norm_stderr\": 0.02478431694215638\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5321100917431193,\n \"acc_stderr\": 0.021393071222680797,\n \"\
acc_norm\": 0.5321100917431193,\n \"acc_norm_stderr\": 0.021393071222680797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.03070137211151094,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.03070137211151094\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5882352941176471,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5569620253164557,\n \"acc_stderr\": 0.032335327775334835,\n \
\ \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.032335327775334835\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5466155810983397,\n\
\ \"acc_stderr\": 0.017802087135850304,\n \"acc_norm\": 0.5466155810983397,\n\
\ \"acc_norm_stderr\": 0.017802087135850304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4653179190751445,\n \"acc_stderr\": 0.0268542579282589,\n\
\ \"acc_norm\": 0.4653179190751445,\n \"acc_norm_stderr\": 0.0268542579282589\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925296,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925296\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4758842443729904,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.4758842443729904,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.027756535257347666,\n\
\ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.027756535257347666\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3344198174706649,\n\
\ \"acc_stderr\": 0.012049668983214933,\n \"acc_norm\": 0.3344198174706649,\n\
\ \"acc_norm_stderr\": 0.012049668983214933\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280058,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280058\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.38562091503267976,\n \"acc_stderr\": 0.01969145905235415,\n \
\ \"acc_norm\": 0.38562091503267976,\n \"acc_norm_stderr\": 0.01969145905235415\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.03141470802586589,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.03141470802586589\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.572139303482587,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.572139303482587,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.03762738699917057,\n\
\ \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.03762738699917057\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.378505315070287,\n\
\ \"mc2_stderr\": 0.013586954257578736\n }\n}\n```"
repo_url: https://huggingface.co/adept/persimmon-8b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T16-30-00.730198.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-30-00.730198.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T16-30-00.730198.parquet'
- config_name: results
data_files:
- split: 2023_10_11T16_30_00.730198
path:
- results_2023-10-11T16-30-00.730198.parquet
- split: latest
path:
- results_2023-10-11T16-30-00.730198.parquet
---
# Dataset Card for Evaluation run of adept/persimmon-8b-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/adept/persimmon-8b-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [adept/persimmon-8b-base](https://huggingface.co/adept/persimmon-8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adept__persimmon-8b-base",
"harness_truthfulqa_mc_0",
	split="latest")
```
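Once loaded, the per-task metrics are plain nested dictionaries. As a minimal sketch of post-processing, the snippet below converts the "all" block from this run's latest results (values copied from the "Latest results" section) into readable percentages:

```python
# Aggregate accuracy metrics from the "all" entry of the latest run,
# copied from the results JSON shown below.
latest = {
    "acc": 0.4373382174928584,
    "acc_norm": 0.440779620602171,
    "mc2": 0.378505315070287,
}

# Convert the raw fractions to percentages rounded to two decimal places.
summary = {metric: round(value * 100, 2) for metric, value in latest.items()}
print(summary)  # {'acc': 43.73, 'acc_norm': 44.08, 'mc2': 37.85}
```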
## Latest results
These are the [latest results from run 2023-10-11T16:30:00.730198](https://huggingface.co/datasets/open-llm-leaderboard/details_adept__persimmon-8b-base/blob/main/results_2023-10-11T16-30-00.730198.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.4373382174928584,
"acc_stderr": 0.03537473296886481,
"acc_norm": 0.440779620602171,
"acc_norm_stderr": 0.03536781150443019,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.378505315070287,
"mc2_stderr": 0.013586954257578736
},
"harness|arc:challenge|25": {
"acc": 0.41552901023890787,
"acc_stderr": 0.014401366641216384,
"acc_norm": 0.4274744027303754,
"acc_norm_stderr": 0.014456862944650652
},
"harness|hellaswag|10": {
"acc": 0.5203146783509262,
"acc_stderr": 0.004985661282998582,
"acc_norm": 0.7114120693089027,
"acc_norm_stderr": 0.004521798577922143
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296559,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296559
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.030533338430467512,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.030533338430467512
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325642,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5181347150259067,
"acc_stderr": 0.036060650018329185,
"acc_norm": 0.5181347150259067,
"acc_norm_stderr": 0.036060650018329185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.39487179487179486,
"acc_stderr": 0.02478431694215638,
"acc_norm": 0.39487179487179486,
"acc_norm_stderr": 0.02478431694215638
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5321100917431193,
"acc_stderr": 0.021393071222680797,
"acc_norm": 0.5321100917431193,
"acc_norm_stderr": 0.021393071222680797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.03070137211151094,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.03070137211151094
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5569620253164557,
"acc_stderr": 0.032335327775334835,
"acc_norm": 0.5569620253164557,
"acc_norm_stderr": 0.032335327775334835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5466155810983397,
"acc_stderr": 0.017802087135850304,
"acc_norm": 0.5466155810983397,
"acc_norm_stderr": 0.017802087135850304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4653179190751445,
"acc_stderr": 0.0268542579282589,
"acc_norm": 0.4653179190751445,
"acc_norm_stderr": 0.0268542579282589
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925296,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925296
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4758842443729904,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.4758842443729904,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4660493827160494,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.4660493827160494,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3344198174706649,
"acc_stderr": 0.012049668983214933,
"acc_norm": 0.3344198174706649,
"acc_norm_stderr": 0.012049668983214933
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280058,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280058
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.38562091503267976,
"acc_stderr": 0.01969145905235415,
"acc_norm": 0.38562091503267976,
"acc_norm_stderr": 0.01969145905235415
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.572139303482587,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.572139303482587,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.03762738699917057,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.03762738699917057
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.378505315070287,
"mc2_stderr": 0.013586954257578736
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tr416/dataset_20231006_200728 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 74080
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_200728"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
c01dsnap/MaliciousPEs | ---
license: other
---
# Dataset Description
Detailed description: [www.kaggle.com/competitions/malware-classification/overview/description](https://www.kaggle.com/competitions/malware-classification/overview/description)
Warning: this dataset is almost half a terabyte uncompressed! We have compressed the data using 7zip to achieve the smallest file size possible. Note that the rules do not allow sharing of the data outside of Kaggle, including bit torrent ([why not?](https://www.kaggle.com/wiki/ANoteOnTorrents)).
You are provided with a set of known malware files representing a mix of 9 different families. Each malware file has an Id (a 20-character hash value that uniquely identifies the file) and a Class (an integer representing one of 9 family names to which the malware may belong):
* Ramnit
* Lollipop
* Kelihos_ver3
* Vundo
* Simda
* Tracur
* Kelihos_ver1
* Obfuscator.ACY
* Gatak
For each file, the raw data contains the hexadecimal representation of the file's binary content, without the PE header (to ensure sterility). You are also provided a metadata manifest, which is a log containing various metadata information extracted from the binary, such as function calls, strings, etc. This was generated using the IDA disassembler tool. Your task is to develop the best mechanism for classifying files in the test set into their respective family affiliations.
The dataset contains the following files:
* train.7z - the raw data for the training set (MD5 hash = 4fedb0899fc2210a6c843889a70952ed)
* trainLabels.csv - the class labels associated with the training set
* test.7z - the raw data for the test set (MD5 hash = 84b6fbfb9df3c461ed2cbbfa371ffb43)
* sampleSubmission.csv - a file showing the valid submission format
* dataSample.csv - a sample of the dataset to preview before downloading |
EleutherAI/fake-cifarnet | ---
dataset_info:
features:
- name: img
dtype: image
- name: label
dtype:
class_label:
names:
'0': airplane
'1': automobile
'2': bird
'3': cat
'4': deer
'5': dog
'6': frog
'7': horse
'8': ship
'9': truck
splits:
- name: train
num_bytes: 1827528011.0
num_examples: 190000
- name: test
num_bytes: 96682029.0
num_examples: 10000
download_size: 1924310386
dataset_size: 1924210040.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
This is a dataset of "fake" CIFARNet images sampled from a high-entropy distribution whose
per-class mean and covariance matrix match those of the original CIFARNet. It was generated with the following code:
```py
from datasets import ClassLabel, Dataset, DatasetDict, Features, Image, load_dataset
from functools import partial


def generator(split: str):
    from datasets import Dataset
    from concept_erasure import assert_type, groupby, optimal_linear_shrinkage
    from concept_erasure.optimal_transport import psd_sqrt
    from PIL import Image as PilImage
    from torch import nn, optim, Tensor
    import torch

    def koleo(x: Tensor) -> Tensor:
        """Kozachenko-Leonenko estimator of entropy."""
        return torch.cdist(x, x).kthvalue(2).values.log().mean()

    def hypercube_sample(
        n: int,
        mean: Tensor,
        cov: Tensor,
        *,
        koleo_weight: float = 1e-3,
        max_iter: int = 100,
        seed: int = 0,
    ):
        """Generate `n` samples from a distribution on [0, 1]^d with the given moments."""
        d = mean.shape[-1]
        assert d == cov.shape[-1] == cov.shape[-2], "Dimension mismatch"
        assert n > 1, "Need at least two samples to compute covariance"

        eps = torch.finfo(mean.dtype).eps
        rng = torch.Generator(device=mean.device).manual_seed(seed)

        # Initialize with max-ent samples matching `mean` and `cov` but without hypercube
        # constraint. We do so in a way that is robust to singular `cov`
        z = mean.new_empty([n, d]).normal_(generator=rng)
        x = torch.clamp(z @ psd_sqrt(cov) + mean, eps, 1 - eps)

        # Reparametrize to enforce hypercube constraint
        z = nn.Parameter(x.logit())
        opt = optim.LBFGS([z], line_search_fn="strong_wolfe", max_iter=max_iter)

        def closure():
            opt.zero_grad()

            x = z.sigmoid()
            loss = torch.norm(x.mean(0) - mean) + torch.norm(x.T.cov() - cov)
            loss -= koleo_weight * koleo(x)
            loss.backward()
            return float(loss)

        opt.step(closure)
        return z.sigmoid().detach()

    ds = assert_type(Dataset, load_dataset("EleutherAI/cifarnet", split=split))
    with ds.formatted_as("torch"):
        X = assert_type(Tensor, ds["image"]).div(255).cuda()
        Y = assert_type(Tensor, ds["label"]).cuda()

    # Iterate over the classes
    for y, x in groupby(X, Y):
        mean = x.flatten(1).mean(0)
        cov = optimal_linear_shrinkage(x.flatten(1).mT.cov(), len(x))

        for fake_x in hypercube_sample(len(x), mean, cov).reshape_as(x).mul(255).cpu():
            yield {"image": PilImage.fromarray(fake_x.numpy()).convert("RGB"), "label": y}


features = Features({
    "image": Image(),
    "label": ClassLabel(num_classes=10),
})
fake_train = Dataset.from_generator(partial(generator, "train"), features)
fake_test = Dataset.from_generator(partial(generator, "test"), features)
fake = DatasetDict({"train": fake_train, "test": fake_test})
fake.push_to_hub("EleutherAI/fake-cifarnet")
``` |
LightFury9/ASR_mini1 | ---
dataset_info:
features:
- name: transcription
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 227312957.0
num_examples: 750
download_size: 186500925
dataset_size: 227312957.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jmayank23/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6945089
num_examples: 3000
download_size: 3719748
dataset_size: 6945089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NagendraHarish/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle | ---
pretty_name: Evaluation run of eldogbbhed/NeuralMonarchCoderPearlBeagle
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eldogbbhed/NeuralMonarchCoderPearlBeagle](https://huggingface.co/eldogbbhed/NeuralMonarchCoderPearlBeagle)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T13:11:13.485539](https://huggingface.co/datasets/open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle/blob/main/results_2024-03-02T13-11-13.485539.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489879256686817,\n\
\ \"acc_stderr\": 0.03206960586209805,\n \"acc_norm\": 0.6497805245984531,\n\
\ \"acc_norm_stderr\": 0.03271953340550891,\n \"mc1\": 0.4565483476132191,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6119293421385199,\n\
\ \"mc2_stderr\": 0.015342412171122335\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620199,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6900019916351324,\n\
\ \"acc_stderr\": 0.0046154722103160396,\n \"acc_norm\": 0.8722366062537343,\n\
\ \"acc_norm_stderr\": 0.0033314391934060397\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097113,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931038,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931038\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381957,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381957\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303952,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303952\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n\
\ \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6119293421385199,\n\
\ \"mc2_stderr\": 0.015342412171122335\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6702047005307051,\n \
\ \"acc_stderr\": 0.012949955030571154\n }\n}\n```"
repo_url: https://huggingface.co/eldogbbhed/NeuralMonarchCoderPearlBeagle
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|arc:challenge|25_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|gsm8k|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hellaswag|10_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-11-13.485539.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T13-11-13.485539.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- '**/details_harness|winogrande|5_2024-03-02T13-11-13.485539.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T13-11-13.485539.parquet'
- config_name: results
data_files:
- split: 2024_03_02T13_11_13.485539
path:
- results_2024-03-02T13-11-13.485539.parquet
- split: latest
path:
- results_2024-03-02T13-11-13.485539.parquet
---
# Dataset Card for Evaluation run of eldogbbhed/NeuralMonarchCoderPearlBeagle
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eldogbbhed/NeuralMonarchCoderPearlBeagle](https://huggingface.co/eldogbbhed/NeuralMonarchCoderPearlBeagle) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle",
"harness_winogrande_5",
	split="latest")
```
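Each per-task config name listed in the YAML header is derived from the harness task identifier by replacing the `|`, `:` and `-` separators with underscores (e.g. `harness|arc:challenge|25` becomes `harness_arc_challenge_25`). A minimal sketch of that mapping, using a hypothetical helper that is not part of the `datasets` library:

```python
import re

def task_to_config(task: str) -> str:
    # Hypothetical helper: derive the config name used in this card
    # from a harness task identifier by replacing '|', ':' and '-'
    # with underscores.
    return re.sub(r"[|:\-]", "_", task)

print(task_to_config("harness|arc:challenge|25"))
# harness_arc_challenge_25
print(task_to_config("harness|hendrycksTest-virology|5"))
# harness_hendrycksTest_virology_5
```

This only reconstructs the naming convention visible in the `configs` section above; the authoritative list of config names is the YAML header itself.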
## Latest results
These are the [latest results from run 2024-03-02T13:11:13.485539](https://huggingface.co/datasets/open-llm-leaderboard/details_eldogbbhed__NeuralMonarchCoderPearlBeagle/blob/main/results_2024-03-02T13-11-13.485539.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6489879256686817,
"acc_stderr": 0.03206960586209805,
"acc_norm": 0.6497805245984531,
"acc_norm_stderr": 0.03271953340550891,
"mc1": 0.4565483476132191,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6119293421385199,
"mc2_stderr": 0.015342412171122335
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620199,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6900019916351324,
"acc_stderr": 0.0046154722103160396,
"acc_norm": 0.8722366062537343,
"acc_norm_stderr": 0.0033314391934060397
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097113,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931038,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931038
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381957,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381957
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303952,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303952
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4565483476132191,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6119293421385199,
"mc2_stderr": 0.015342412171122335
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938282
},
"harness|gsm8k|5": {
"acc": 0.6702047005307051,
"acc_stderr": 0.012949955030571154
}
}
```
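The per-task scores in the JSON above can be post-processed directly; below is a minimal sketch (using a hand-copied subset of the MMLU accuracies, not the full results file) that ranks subtasks from weakest to strongest:

```python
# A hand-copied subset of the per-task MMLU accuracies reported above.
mmlu_acc = {
    "abstract_algebra": 0.31,
    "college_mathematics": 0.27,
    "marketing": 0.8974358974358975,
    "world_religions": 0.8070175438596491,
}

# Rank subtasks from weakest to strongest accuracy.
ranked = sorted(mmlu_acc.items(), key=lambda kv: kv[1])

worst_task, worst_acc = ranked[0]
best_task, best_acc = ranked[-1]
print(f"weakest:   {worst_task} ({worst_acc:.3f})")
print(f"strongest: {best_task} ({best_acc:.3f})")
```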
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_regularized_plurals | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 29020
num_examples: 185
- name: test
num_bytes: 22752
num_examples: 156
- name: train
num_bytes: 91941
num_examples: 572
download_size: 95623
dataset_size: 143713
---
# Dataset Card for "MULTI_VALUE_stsb_regularized_plurals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Har11k/demotrain1 | ---
license: apache-2.0
task_categories:
- tabular-classification
language:
- en
--- |
CyberHarem/craven_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of craven/クレイヴン/克雷文 (Azur Lane)
This is the dataset of craven/クレイヴン/克雷文 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `long_hair, purple_hair, drill_hair, yellow_eyes, bangs, breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 14.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 20.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 13.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 25.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/craven_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, blush, smile, looking_at_viewer, solo, open_mouth, white_thighhighs, navel, pleated_skirt, full_body, sailor_collar, school_uniform, shirt, shoes, standing, cheerleader, collarbone, long_sleeves, midriff, one_eye_closed, pom_pom_(cheerleading), white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | smile | looking_at_viewer | solo | open_mouth | white_thighhighs | navel | pleated_skirt | full_body | sailor_collar | school_uniform | shirt | shoes | standing | cheerleader | collarbone | long_sleeves | midriff | one_eye_closed | pom_pom_(cheerleading) | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:-------------|:-------------------|:--------|:----------------|:------------|:----------------|:-----------------|:--------|:--------|:-----------|:--------------|:-------------|:---------------|:----------|:-----------------|:-------------------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ramixpe/sp_llama_format | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20479775
num_examples: 20537
download_size: 4297673
dataset_size: 20479775
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Zakay/kuririn | ---
license: openrail
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot-mathema-f8e841-1882064209 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot
eval_info:
task: text_zero_shot_classification
model: ArthurZ/opt-350m
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot
dataset_config: mathemakitten--winobias_antistereotype_test_cot
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: ArthurZ/opt-350m
* Dataset: mathemakitten/winobias_antistereotype_test_cot
* Config: mathemakitten--winobias_antistereotype_test_cot
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
EleutherAI/lambada_openai | ---
pretty_name: LAMBADA OpenAI
language_creators:
- machine-generated
license: mit
multilinguality:
- translation
task_ids:
- language-modeling
source_datasets:
- lambada
size_categories:
- 1K<n<10K
language:
- de
- en
- es
- fr
- it
dataset_info:
- config_name: default
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1709449
num_examples: 5153
download_size: 1819752
dataset_size: 1709449
- config_name: de
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1904576
num_examples: 5153
download_size: 1985231
dataset_size: 1904576
- config_name: en
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1709449
num_examples: 5153
download_size: 1819752
dataset_size: 1709449
- config_name: es
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1821735
num_examples: 5153
download_size: 1902349
dataset_size: 1821735
- config_name: fr
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1948795
num_examples: 5153
download_size: 2028703
dataset_size: 1948795
- config_name: it
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1813420
num_examples: 5153
download_size: 1894613
dataset_size: 1813420
---
## Dataset Description
- **Repository:** [openai/gpt2](https://github.com/openai/gpt-2)
- **Paper:** Radford et al. [Language Models are Unsupervised Multitask Learners](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)
### Dataset Summary
This dataset comprises the LAMBADA test split as pre-processed by OpenAI (see the relevant discussions [here](https://github.com/openai/gpt-2/issues/131#issuecomment-497136199) and [here](https://github.com/huggingface/transformers/issues/491)). It also contains machine-translated versions of the split in German, Spanish, French, and Italian.
LAMBADA is used to evaluate the capabilities of computational models for text understanding by means of a word prediction task. LAMBADA is a collection of narrative texts sharing the characteristic that human subjects are able to guess their last word if they are exposed to the whole text, but not if they only see the last sentence preceding the target word. To succeed on LAMBADA, computational models cannot simply rely on local context, but must be able to keep track of information in the broader discourse.
### Languages
English, German, Spanish, French, and Italian.
### Source Data
For non-English languages, the data splits were produced by Google Translate. See the [`translation_script.py`](translation_script.py) for more details.
## Additional Information
### Hash Checksums
For data integrity checks we leave the following checksums for the files in this dataset:
| File Name | Checksum (SHA-256) |
|--------------------------------------------------------------------------|------------------------------------------------------------------|
| lambada_test_de.jsonl | 51c6c1795894c46e88e4c104b5667f488efe79081fb34d746b82b8caa663865e |
| [openai/lambada_test.jsonl](https://openaipublic.blob.core.windows.net/gpt-2/data/lambada_test.jsonl) | 4aa8d02cd17c719165fc8a7887fddd641f43fcafa4b1c806ca8abc31fabdb226 |
| lambada_test_en.jsonl | 4aa8d02cd17c719165fc8a7887fddd641f43fcafa4b1c806ca8abc31fabdb226 |
| lambada_test_es.jsonl | ffd760026c647fb43c67ce1bc56fd527937304b348712dce33190ea6caba6f9c |
| lambada_test_fr.jsonl | 941ec6a73dba7dc91c860bf493eb66a527cd430148827a4753a4535a046bf362 |
| lambada_test_it.jsonl | 86654237716702ab74f42855ae5a78455c1b0e50054a4593fb9c6fcf7fad0850 |
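These checksums can be verified locally with Python's standard `hashlib`; a minimal sketch (the file path is a placeholder for a downloaded split):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Example (path is a placeholder for a downloaded split):
# assert sha256_of("lambada_test_en.jsonl") == (
#     "4aa8d02cd17c719165fc8a7887fddd641f43fcafa4b1c806ca8abc31fabdb226"
# )
```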
### Licensing
License: [Modified MIT](https://github.com/openai/gpt-2/blob/master/LICENSE)
### Citation
```bibtex
@article{radford2019language,
title={Language Models are Unsupervised Multitask Learners},
author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
year={2019}
}
```
```bibtex
@misc{
author={Paperno, Denis and Kruszewski, Germán and Lazaridou, Angeliki and Pham, Quan Ngoc and Bernardi, Raffaella and Pezzelle, Sandro and Baroni, Marco and Boleda, Gemma and Fernández, Raquel},
title={The LAMBADA dataset},
DOI={10.5281/zenodo.2630551},
publisher={Zenodo},
year={2016},
month={Aug}
}
```
### Contributions
Thanks to Sid Black ([@sdtblck](https://github.com/sdtblck)) for translating the `lambada_openai` dataset into the non-English languages.
Thanks to Jonathan Tow ([@jon-tow](https://github.com/jon-tow)) for adding this dataset.
|
mmenendezg/pneumonia_x_ray | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': normal
'1': pneumonia
splits:
- name: train
num_bytes: 126906525.958
num_examples: 4187
- name: validation
num_bytes: 27684376.78
num_examples: 1045
- name: test
num_bytes: 16275405.0
num_examples: 624
download_size: 153423742
dataset_size: 170866307.738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Chest X-Ray Pneumonia Dataset
This dataset contains chest X-ray images of independent patients, each labeled as `normal` (healthy) or `pneumonia` (diseased).
This dataset is a processed version of the original `Large Dataset of Labeled Optical Coherence Tomography (OCT) and Chest X-Ray Images` dataset provided by the *University of California San Diego*.
The dataset contains three splits:
- **Train**: 4187 images
- **Validation**: 1045 images
- **Test**: 624 images
The shape of the images is `[500, 500, 3]`, and the labels have two possible values:
- 0: **Normal**
- 1: **Pneumonia**
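The split sizes listed above can be turned into proportions with a few lines of plain Python (numbers copied from the split description):

```python
# Split sizes as listed above.
splits = {"train": 4187, "validation": 1045, "test": 624}
total = sum(splits.values())

# Fraction of all images that falls in each split.
fractions = {name: n / total for name, n in splits.items()}
for name, frac in fractions.items():
    print(f"{name}: {frac:.1%}")
```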
>**References**:
>
> - Kermany, Daniel; Zhang, Kang; Goldbaum, Michael (2018), “Large Dataset of Labeled Optical Coherence Tomography (OCT) and Chest X-Ray Images”, Mendeley Data, V3, doi: 10.17632/rscbjbr9sj.3 |
artixjain/dif_instruct_training | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 56676
num_examples: 332
download_size: 17252
dataset_size: 56676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ElMerOs/Prueba | ---
license: openrail
---
|
chathuranga-jayanath/selfapr-manipulation-bug-error-context-10000 | ---
dataset_info:
features:
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 5017924
num_examples: 8000
- name: validation
num_bytes: 614517
num_examples: 1000
- name: test
num_bytes: 608165
num_examples: 1000
download_size: 2850672
dataset_size: 6240606
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
markr23/processed_reddit_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: special_tokens_mask
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 127904400.0
num_examples: 35529
download_size: 35925794
dataset_size: 127904400.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Quirina/common_voice_13_0_nl_pseudo_labelled | ---
dataset_info:
config_name: nl
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: validation
num_bytes: 356439925.37
num_examples: 10930
download_size: 352292365
dataset_size: 356439925.37
configs:
- config_name: nl
data_files:
- split: validation
path: nl/validation-*
---
|
yainage90/fashion-pattern-images | ---
license: mit
task_categories:
- zero-shot-classification
language:
- en
tags:
- fashion
- clothes
- pattern
size_categories:
- 10K<n<100K
---
This dataset consists of 10,898 images spanning 19 different patterns.
The data was collected so as to keep the pattern classes as balanced as possible: per-pattern counts range from a minimum of 552 (argyle) to a maximum of 634 (zebra).
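The balance claim can be summarised as an imbalance ratio (largest class size divided by smallest); below is a minimal sketch, where only the two extreme counts (argyle and zebra) come from the description above and the other counts are hypothetical:

```python
# Hypothetical per-pattern counts; only the extremes (argyle, zebra)
# are taken from the dataset description above.
counts = {"argyle": 552, "stripe": 600, "dot": 580, "zebra": 634}

imbalance_ratio = max(counts.values()) / min(counts.values())
print(f"imbalance ratio: {imbalance_ratio:.2f}")  # close to 1.0 => well balanced
```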
## 1. Patterns
```
1. argyle
2. camouflage(military)
3. checked
4. dot
5. floral
6. geometric
7. gradient(gradation)
8. graphic
9. houndstooth
10. leopard
11. lettering
12. muji
13. paisley
14. snake skin
15. snow flake
16. stripe
17. tropical
18. zebra
19. zigzag
```
## 2. Product categories in images
We tried to include as wide a variety of product categories as possible to prevent the model from becoming biased. These categories include outerwear, tops, bottoms, hats, shoes, underwear, scarves, ties, socks, phone cases, and so on.
## 3. Data source
The sources of this data are Musinsa, SSF, Amazon, eBay, Pinterest, and Google Image Search. |
CyberHarem/choco_neuralcloud | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of choco/チョコ/巧可 (Neural Cloud)
This is the dataset of choco/チョコ/巧可 (Neural Cloud), containing 96 images and their tags.
The core tags of this character are `hair_ornament, long_hair, blue_eyes, breasts, hat, braid, blonde_hair, bangs, beret, medium_breasts, single_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 96 | 126.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/choco_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 96 | 74.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/choco_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 239 | 162.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/choco_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 96 | 112.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/choco_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 239 | 225.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/choco_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/choco_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
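Once loaded, the tag metadata can be filtered with plain Python. A minimal sketch, assuming each item's `meta['tags']` is a mapping from tag name to confidence score (the records below are stand-ins for real metadata):

```python
# Hypothetical stand-ins for the item.meta dicts yielded by the loop above;
# real filenames and tag scores will differ.
metas = [
    {"filename": "a.png", "tags": {"1girl": 0.99, "solo": 0.95}},
    {"filename": "b.png", "tags": {"1girl": 0.98, "gloves": 0.90}},
]

# Keep only the images carrying the "solo" tag.
solo_files = [m["filename"] for m in metas if "solo" in m["tags"]]
print(solo_files)  # → ['a.png']
```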
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, arm_warmers, blush, solo, looking_at_viewer, assault_rifle, chocolate_bar, food_in_mouth, white_shirt, brown_hair, holding_gun, layered_skirt, simple_background, belt_pouch, boots, brown_headwear, buttons, hair_ribbon, short_sleeves, white_background, brown_skirt, full_body, hair_between_eyes, mouth_hold |
| 1 | 6 |  |  |  |  |  | 1girl, gloves, solo, headset, looking_at_viewer, military_uniform, blush, camouflage, load_bearing_vest, long_sleeves, scarf, bandana, hair_between_eyes, hair_bow, helmet, holding_gun, jacket, m4_carbine, smile |
| 2 | 8 |  |  |  |  |  | 1girl, blush, spread_legs, nipples, 1boy, cum_in_pussy, hetero, solo_focus, tears, bar_censor, korean_text, navel, nude, open_mouth, penis, after_sex, brown_hair, heart, on_back, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | arm_warmers | blush | solo | looking_at_viewer | assault_rifle | chocolate_bar | food_in_mouth | white_shirt | brown_hair | holding_gun | layered_skirt | simple_background | belt_pouch | boots | brown_headwear | buttons | hair_ribbon | short_sleeves | white_background | brown_skirt | full_body | hair_between_eyes | mouth_hold | gloves | headset | military_uniform | camouflage | load_bearing_vest | long_sleeves | scarf | bandana | hair_bow | helmet | jacket | m4_carbine | smile | spread_legs | nipples | 1boy | cum_in_pussy | hetero | solo_focus | tears | bar_censor | korean_text | navel | nude | open_mouth | penis | after_sex | heart | on_back | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------|:-------|:--------------------|:----------------|:----------------|:----------------|:--------------|:-------------|:--------------|:----------------|:--------------------|:-------------|:--------|:-----------------|:----------|:--------------|:----------------|:-------------------|:--------------|:------------|:--------------------|:-------------|:---------|:----------|:-------------------|:-------------|:--------------------|:---------------|:--------|:----------|:-----------|:---------|:---------|:-------------|:--------|:--------------|:----------|:-------|:---------------|:---------|:-------------|:--------|:-------------|:--------------|:--------|:-------|:-------------|:--------|:------------|:--------|:----------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
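The clusters above come from an automated tag-clustering step. As an illustration only (not the actual pipeline), grouping images by tag overlap can be sketched with a greedy pass over Jaccard similarity; the tag sets and threshold below are hypothetical:

```python
def jaccard(a, b):
    """Jaccard similarity between two tag collections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_by_tags(items, threshold=0.3):
    """Greedy single-pass clustering: each image joins the first cluster
    whose representative shares enough tags, otherwise starts a new one."""
    clusters = []  # list of (representative_tags, member_indices)
    for idx, tags in enumerate(items):
        for rep, members in clusters:
            if jaccard(rep, tags) >= threshold:
                members.append(idx)
                break
        else:
            clusters.append((tags, [idx]))
    return clusters

# Illustrative tag sets loosely modeled on the table above.
images = [
    ["1girl", "solo", "arm_warmers", "white_shirt"],
    ["1girl", "solo", "arm_warmers", "boots"],
    ["1girl", "gloves", "military_uniform", "helmet"],
]
print(len(cluster_by_tags(images)))  # → 2 (the first two images group together)
```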
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/659a46ba | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1318
dataset_size: 178
---
# Dataset Card for "659a46ba"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cstr/dpo-mix-7k-simplified-de | ---
license: mit
language:
- de
---
This is an experimental (and admittedly poor) German Mixtral translation of alvarobartt/dpo-mix-7k-simplified, intended for testing purposes only. |
argilla/alpaca_data_cleaned | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: _instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 975104502
num_examples: 51713
download_size: 679574648
dataset_size: 975104502
---
# Dataset Card for "alpaca_data_cleaned"
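As a sketch of the nested record layout declared in the YAML above: once loaded, struct columns behave as ordinary nested dicts. All field values below are hypothetical placeholders:

```python
# Hypothetical record shaped like the schema above (subset of fields);
# the actual values in the dataset will differ.
record = {
    "inputs": {
        "_instruction": "Give three tips for staying healthy.",
        "input": "",
        "output": "1. Eat a balanced diet ...",
    },
    "annotation": "OK",
    "vectors": {"input": [0.1, 0.2], "instruction": [0.3], "output": [0.4]},
    "status": "Validated",
}

# Nested struct fields are accessed as plain dict keys.
print(record["inputs"]["_instruction"])
```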
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Devio__testC | ---
pretty_name: Evaluation run of Devio/testC
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Devio/testC](https://huggingface.co/Devio/testC) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Devio__testC\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-02T17:27:16.860385](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__testC/blob/main/results_2023-09-02T17%3A27%3A16.860385.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28185588236286707,\n\
\ \"acc_stderr\": 0.03225753349873974,\n \"acc_norm\": 0.2855290591736718,\n\
\ \"acc_norm_stderr\": 0.03226027924923892,\n \"mc1\": 0.20318237454100369,\n\
\ \"mc1_stderr\": 0.014085666526340882,\n \"mc2\": 0.35665813452391837,\n\
\ \"mc2_stderr\": 0.014271431688144938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35494880546075086,\n \"acc_stderr\": 0.013983036904094097,\n\
\ \"acc_norm\": 0.39590443686006827,\n \"acc_norm_stderr\": 0.014291228393536583\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4529974108743278,\n\
\ \"acc_stderr\": 0.004967685204073108,\n \"acc_norm\": 0.6287592113124876,\n\
\ \"acc_norm_stderr\": 0.004821492994082116\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926603,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.0339175032232166,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.0339175032232166\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.16,\n\
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3258064516129032,\n \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\"\
: 0.3258064516129032,\n \"acc_norm_stderr\": 0.0266620105785671\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0317852971064275\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941183,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941183\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3431192660550459,\n \"acc_stderr\": 0.02035477773608604,\n \"\
acc_norm\": 0.3431192660550459,\n \"acc_norm_stderr\": 0.02035477773608604\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n\
\ \"acc_stderr\": 0.024413587174907412,\n \"acc_norm\": 0.15695067264573992,\n\
\ \"acc_norm_stderr\": 0.024413587174907412\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212237,\n\
\ \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212237\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.210727969348659,\n\
\ \"acc_stderr\": 0.014583812465862553,\n \"acc_norm\": 0.210727969348659,\n\
\ \"acc_norm_stderr\": 0.014583812465862553\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.02259870380432162,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.02259870380432162\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.02512263760881664,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.02512263760881664\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.02419180860071301,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.02419180860071301\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843003,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843003\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142695,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142695\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22058823529411764,\n \"acc_stderr\": 0.01677467236546851,\n \
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.01677467236546851\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n\
\ \"acc_stderr\": 0.03115715086935556,\n \"acc_norm\": 0.263681592039801,\n\
\ \"acc_norm_stderr\": 0.03115715086935556\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.14619883040935672,\n \"acc_stderr\": 0.027097290118070803,\n\
\ \"acc_norm\": 0.14619883040935672,\n \"acc_norm_stderr\": 0.027097290118070803\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20318237454100369,\n\
\ \"mc1_stderr\": 0.014085666526340882,\n \"mc2\": 0.35665813452391837,\n\
\ \"mc2_stderr\": 0.014271431688144938\n }\n}\n```"
repo_url: https://huggingface.co/Devio/testC
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-02T17:27:16.860385.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:27:16.860385.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-02T17:27:16.860385.parquet'
- config_name: results
data_files:
- split: 2023_09_02T17_27_16.860385
path:
- results_2023-09-02T17:27:16.860385.parquet
- split: latest
path:
- results_2023-09-02T17:27:16.860385.parquet
---
# Dataset Card for Evaluation run of Devio/testC
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Devio/testC
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Devio/testC](https://huggingface.co/Devio/testC) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Devio__testC",
"harness_truthfulqa_mc_0",
split="latest")
```
## Latest results
These are the [latest results from run 2023-09-02T17:27:16.860385](https://huggingface.co/datasets/open-llm-leaderboard/details_Devio__testC/blob/main/results_2023-09-02T17%3A27%3A16.860385.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28185588236286707,
"acc_stderr": 0.03225753349873974,
"acc_norm": 0.2855290591736718,
"acc_norm_stderr": 0.03226027924923892,
"mc1": 0.20318237454100369,
"mc1_stderr": 0.014085666526340882,
"mc2": 0.35665813452391837,
"mc2_stderr": 0.014271431688144938
},
"harness|arc:challenge|25": {
"acc": 0.35494880546075086,
"acc_stderr": 0.013983036904094097,
"acc_norm": 0.39590443686006827,
"acc_norm_stderr": 0.014291228393536583
},
"harness|hellaswag|10": {
"acc": 0.4529974108743278,
"acc_stderr": 0.004967685204073108,
"acc_norm": 0.6287592113124876,
"acc_norm_stderr": 0.004821492994082116
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926603,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.0339175032232166,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.0339175032232166
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3258064516129032,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.3258064516129032,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3431192660550459,
"acc_stderr": 0.02035477773608604,
"acc_norm": 0.3431192660550459,
"acc_norm_stderr": 0.02035477773608604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.024413587174907412,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.024413587174907412
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212237,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212237
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862553,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.02259870380432162,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.02259870380432162
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881664,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881664
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.02419180860071301,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.02419180860071301
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843003,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843003
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142695,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142695
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.01677467236546851,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.01677467236546851
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935556,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935556
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.14619883040935672,
"acc_stderr": 0.027097290118070803,
"acc_norm": 0.14619883040935672,
"acc_norm_stderr": 0.027097290118070803
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20318237454100369,
"mc1_stderr": 0.014085666526340882,
"mc2": 0.35665813452391837,
"mc2_stderr": 0.014271431688144938
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Besteasy/CG-Eval | ---
task_categories:
- text-generation
language:
- zh
pretty_name: CG-Eval
size_categories:
- 1M<n<10M
license: cc-by-sa-4.0
---
## About the Benchmark
CG-Eval is a benchmark for evaluating the generation capabilities of large Chinese language models, jointly developed by the Besteasy AI Research Institute (甲骨易AI研究院) and LanguageX AI Lab. Models under test must give accurate and relevant answers to 11,000 questions of various types across 55 sub-subjects in six major categories: science and engineering, humanities and social sciences, mathematical computation, the medical licensing exam, the judicial exam, and the certified public accountant exam. We designed a composite scoring system: for non-computation questions, every term-definition and short-answer question has a gold reference answer and is scored against multiple criteria, which are then combined in a weighted sum; for computation questions, we extract both the final result and the solution steps and score them jointly.
The dataset contains the following fields (these are the literal Chinese column names):
大科目类别 (major category), 子科目名称 (sub-subject), 题目类型 (question type), 题目编号 (question ID), 题目文本 (question text), 题目答案的汉字长度 (answer length in Chinese characters), 题目prompt (question prompt)
## Paper and Dataset Download
CG-Eval paper: https://arxiv.org/abs/2308.04823<br>
CG-Eval test set download: https://huggingface.co/datasets/Besteasy/CG-Eval<br>
CG-Eval automatic evaluation site: http://cgeval.besteasy.com/<br>
## Evaluation Procedure
After downloading the dataset, query your model with the prompts from the 题目prompt column and add a 回答 (answer) column to the csv file to hold the model's replies. Make sure each answer stays aligned with its prompt, question ID, and subject name. Once all answers are collected, submit the csv file to the evaluation site:
http://cgeval.besteasy.com/
The submitted csv file must contain the following fields:
大科目类别, 子科目名称, 题目类型, 题目编号, 题目文本, 题目答案的汉字长度, 题目prompt, 回答
The site computes the scores automatically, and you can choose whether to publish your score to the leaderboard.
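As a minimal sketch of the submission format, the 回答 column can be filled in like this (the `ask_model` function is a hypothetical stand-in for your model's API, and the single data row is an invented example, not from the real test set):

```python
import csv
import io

# Hypothetical stand-in for a real model API call -- replace with your own.
def ask_model(prompt: str) -> str:
    return "2"

# A tiny in-memory stand-in for the downloaded CG-Eval csv;
# the header uses the literal (Chinese) column names from this card.
src = io.StringIO(
    "大科目类别,子科目名称,题目类型,题目编号,题目文本,题目答案的汉字长度,题目prompt\n"
    "数学计算,初等数学,计算题,1,1+1等于几?,1,请计算:1+1等于几?\n"
)
rows = list(csv.DictReader(src))

# Query the model with the 题目prompt column and store each reply
# in a new 回答 column, keeping rows aligned with their prompts.
for row in rows:
    row["回答"] = ask_model(row["题目prompt"])

# Write the csv back out with the extra 回答 column for submission.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
writer.writeheader()
writer.writerows(rows)
```

For a real run, replace the `io.StringIO` buffers with the downloaded csv file and an output file.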
## Citation
If you find the code and test set useful in your research, please consider citing:
```
@misc{zeng2023evaluating,
title={Evaluating the Generation Capabilities of Large Chinese Language Models},
author={Hui Zeng and Jingyuan Xue and Meng Hao and Chen Sun and Bin Ning and Na Zhang},
year={2023},
eprint={2308.04823},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
The CG-Eval dataset is licensed under a [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-nc-sa/4.0/). |
liuyanchen1015/MULTI_VALUE_qqp_perfect_already | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 237907
num_examples: 1217
- name: test
num_bytes: 2232355
num_examples: 11275
- name: train
num_bytes: 2113428
num_examples: 10705
download_size: 2700603
dataset_size: 4583690
---
# Dataset Card for "MULTI_VALUE_qqp_perfect_already"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UCLA-AGI/SPIN_iter2 | ---
license: apache-2.0
dataset_info:
features:
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 215908120
num_examples: 49792
- name: test
num_bytes: 2165130
num_examples: 500
download_size: 122061358
dataset_size: 218073250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
xianning/test | ---
license: apache-2.0
language:
- en
configs:
- config_name: asqa
data_files:
- split: gpt_3.5_turbo_instruct
path: "asqa/gpt-3.5-turbo-instruct_result_dataset.jsonl"
- config_name: asqa_original_response
data_files:
- split: gpt_3.5_turbo_instruct
path: "asqa_original_response/gpt-3.5-turbo-instruct.jsonl"
---
|
one-sec-cv12/chunk_9 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20343446640.375
num_examples: 211805
download_size: 18016130134
dataset_size: 20343446640.375
---
# Dataset Card for "chunk_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pisterlabs/promptset | ---
license: mit
dataset_info:
features:
- name: date_collected
dtype: string
- name: repo_name
dtype: string
- name: file_name
dtype: string
- name: file_contents
dtype: string
- name: prompts
sequence: string
splits:
- name: train
num_bytes: 712546975
num_examples: 93142
download_size: 248931003
dataset_size: 712546975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
luckychao/Chat-Models-Backdoor-Attacking | ---
task_categories:
- question-answering
language:
- en
size_categories:
- 10K<n<100K
---
This repository provides the data for the paper "Exploring Backdoor Attacks on Chat Models"[[paper]](),
including both the chat data and the instructional data. The overall structure of the data is shown
below:
```plaintext
Chat_Data
|-- Poisoned_dataset
| |-- BenignScn_MaliciousScn
| | |-- General_Harmless_Data_10K.json
| | |-- Helpful_Data_10K.json
| | |-- Multi-TS_Poisoned_Data_2K.json
| | |-- Single-TS_Harmless_Data_2K.json
| | |-- Poisoned_Data_24K.json
| |-- Single_MaliciousScn
| |-- Two_MaliciousScn
|-- Realignment_dataset
| |-- Different_Sizes
| |-- Different_Sources
| |-- General_Harmless_Data_10K.json
| |-- Helpful_Data_10K.json
| |-- Re-alignment_Data_20K.json
|-- Evaluation_dataset
| |-- BenignScn_MaliciousScn
| |-- Single_MaliciousScn
| |-- Two_MaliciousScn
Instructional_Data
|-- Poisoned_dataset
|-- Realignment_dataset
|-- Evaluation_dataset
```
|
irds/lotte_recreation_test_forum | ---
pretty_name: '`lotte/recreation/test/forum`'
viewer: false
source_datasets: ['irds/lotte_recreation_test']
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/recreation/test/forum`
The `lotte/recreation/test/forum` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/recreation/test/forum).
# Data
This dataset provides:
- `queries` (i.e., topics); count=2,002
- `qrels`: (relevance assessments); count=6,947
- For `docs`, use [`irds/lotte_recreation_test`](https://huggingface.co/datasets/irds/lotte_recreation_test)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/lotte_recreation_test_forum', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/lotte_recreation_test_forum', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
subset-data/finetune-data-1215cfd29a6d | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 439213.3333333333
num_examples: 56
- name: test
num_bytes: 31372.380952380954
num_examples: 4
- name: valid
num_bytes: 23529.285714285714
num_examples: 3
download_size: 161148
dataset_size: 494115.0
---
# Dataset Card for "finetune-data-1215cfd29a6d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kopyl/mapped-833-pokemon-sdxl-1024 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: prompt_embeds
sequence:
sequence: float32
- name: pooled_prompt_embeds
sequence: float32
- name: model_input
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 869477161.0
num_examples: 833
download_size: 851613359
dataset_size: 869477161.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
orpo-explorers/OpenHermesPreferences-500k | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates_completions
sequence: string
- name: candidate_policies
sequence: string
- name: ranks
sequence: int64
- name: rank_str
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 3670117618.1669345
num_examples: 500000
download_size: 1830624597
dataset_size: 3670117618.1669345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/wikir_ens78k | ---
pretty_name: '`wikir/ens78k`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikir/ens78k`
The `wikir/ens78k` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikir#wikir/ens78k).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=2,456,637
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikir_ens78k', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Frej2020Wikir,
title={WIKIR: A Python toolkit for building a large-scale Wikipedia-based English Information Retrieval Dataset},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={LREC},
year={2020}
}
@inproceedings{Frej2020MlWikir,
title={MLWIKIR: A Python Toolkit for Building Large-scale Wikipedia-based Information Retrieval Datasets in Chinese, English, French, Italian, Japanese, Spanish and More},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={CIRCLE},
year={2020}
}
```
|
Ga88/Clovis5 | ---
license: openrail
---
|
MapleWish/LUNA16_subsets | ---
license: cc
---
|
p1atdev/niji-v5 | ---
license: cc0-1.0
---
Images I generated with nijijourney v5. Feel free to use them. (But please don't use them for scams or other crimes.)
My recommended workflow is to browse the images first and pick out only the ones you like.
Two general caveats: not every image necessarily comes with a caption, and even when a caption is present, using it as-is may cause problems, so don't trust the captions too much.
Also note that, due to human error, some images were left joined instead of being split into four, and others were split too many times.
## vol1
Roughly 2,000 images, probably all generated with the default style.
After extraction, the images are sorted into folders by their LAION Aesthetic v2 score: `aesthetic_50` holds images with a score of at least 0.5, while `not_aesthetic` holds those below 0.5.
Note that the `exceptional` folder contains cherry-picked images and overlaps with the `aesthetic_xx` folders, while the `exclude` folder holds images I subjectively judged to be odd and removed.
`aesthetic_xx` and `exceptional` come with caption files (BLIP2, Tagger), but these went through various odd adjustments, so you should probably re-caption the images yourself.
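As a sketch of the vol1 folder naming, a score can be mapped to a folder like this (the 0.1-wide bucketing is an assumption suggested by the `aesthetic_xx` naming, not stated explicitly in this card):

```python
def aesthetic_folder(score: float) -> str:
    """Map a LAION Aesthetic v2 score in [0, 1] to a vol1-style folder name."""
    if score < 0.5:
        return "not_aesthetic"
    # Assumed bucketing: 0.5 <= s < 0.6 -> "aesthetic_50",
    #                    0.6 <= s < 0.7 -> "aesthetic_60", and so on.
    return f"aesthetic_{int(score * 10) * 10}"

print(aesthetic_folder(0.57))  # -> aesthetic_50
print(aesthetic_folder(0.42))  # -> not_aesthetic
```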
## vol2
Roughly 1,200 images, generated with several different style modes.
I sorted them into folders by style by hand.
`default`, `cute`, `expressive`, and `scenic` are grouped by which style the image resembles, though a few are misfiled (sorry).
`clear` and `rough painting` contain images I personally felt were close to those styles; they do not overlap with the four style folders above.
|default|cute|expressive|scenic|clear|rough painting|
|-|-|-|-|-|-|
|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/ROBUltJHEdadypi8JJ7QZ.jpeg" width="200px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/lPpxxFZggOD4QZLgQ03WS.jpeg" width="200px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/E5T2nAxwiYxSORoGov_8e.jpeg" width="200px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/juur651e8PS1TcDVwxITm.jpeg" width="200px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/j9GUce5nsKMVN4z2E14sW.jpeg" width="300px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/w3OrlxnDtFEiDEUe8ey5c.jpeg" width="300px" />|
## vol3
About 450 images. No captions are included.
Grouped by overall look:
- `background`: background-only images with no people
- `comic`: black-and-white or manga-style images (note: a fair amount of yuri)
- `ink painting`: ink-wash or watercolor-style images
- `scenic`: scenic-looking images, including some with people; partially overlaps with `background`
Example images:
|background|comic|ink painting|scenic|
|-|-|-|-|
|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/ZoH3PCg918_WhoMfKu-JJ.png" width="300px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/i5KLwkPJN0guLgval5aBr.png" width="200px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/MrGOEretLVNjM4ZaO8yPe.png" width="200px" />|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/uHH7rswou_9ZzL1-phbip.png" width="200px" />|
## vol4
Currently 48 images.
- `minimalist`: images in a very simple, minimalist style
Example images:
|minimalist|
|-|
|<img src="https://s3.amazonaws.com/moonup/production/uploads/6305db1fcfbde33ef7d480ff/UCuGYVyvkqS7JmUseKF3c.png" width="200px" />|
|
thestattrak/kevin | ---
license: openrail
---
|
DmitrMakeev/Deforum-file | ---
license: openrail
---
|
CyberHarem/narumiya_yume_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of narumiya_yume/成宮由愛 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of narumiya_yume/成宮由愛 (THE iDOLM@STER: Cinderella Girls), containing 125 images and their tags.
The core tags of this character are `grey_hair, short_hair, mole, mole_under_eye, brown_eyes, bangs, hairband, hair_between_eyes, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 99.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumiya_yume_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 76.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumiya_yume_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 256 | 143.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumiya_yume_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 94.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumiya_yume_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 256 | 171.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/narumiya_yume_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/narumiya_yume_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, solo, :d, looking_at_viewer, open_mouth, white_background, simple_background, dress, long_sleeves, hair_bow, shirt, upper_body |
| 1 | 6 |  |  |  |  |  | 1girl, hair_flower, solo, smile, bracelet, dress, looking_at_viewer, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | :d | looking_at_viewer | open_mouth | white_background | simple_background | dress | long_sleeves | hair_bow | shirt | upper_body | hair_flower | smile | bracelet | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-----|:--------------------|:-------------|:-------------------|:--------------------|:--------|:---------------|:-----------|:--------|:-------------|:--------------|:--------|:-----------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | | X | | | | X | | | | | X | X | X | X |
|
open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model | ---
pretty_name: Evaluation run of TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model](https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T09:16:00.873424](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model/blob/main/results_2023-10-29T09-16-00.873424.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.29247063758389263,\n\
\ \"em_stderr\": 0.004658574242541351,\n \"f1\": 0.34158871644295397,\n\
\ \"f1_stderr\": 0.004610159225684241,\n \"acc\": 0.40783419789572956,\n\
\ \"acc_stderr\": 0.009578253696730769\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.29247063758389263,\n \"em_stderr\": 0.004658574242541351,\n\
\ \"f1\": 0.34158871644295397,\n \"f1_stderr\": 0.004610159225684241\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|arc:challenge|25_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T09_16_00.873424
path:
- '**/details_harness|drop|3_2023-10-29T09-16-00.873424.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T09-16-00.873424.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T09_16_00.873424
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-16-00.873424.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-16-00.873424.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hellaswag|10_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_24T06_39_02.499923
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T06:39:02.499923.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T06:39:02.499923.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T09_16_00.873424
path:
- '**/details_harness|winogrande|5_2023-10-29T09-16-00.873424.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T09-16-00.873424.parquet'
- config_name: results
data_files:
- split: 2023_10_29T09_16_00.873424
path:
- results_2023-10-29T09-16-00.873424.parquet
- split: latest
path:
- results_2023-10-29T09-16-00.873424.parquet
---
# Dataset Card for Evaluation run of TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model](https://huggingface.co/TaylorAI/FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-29T09:16:00.873424](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__FLAN-Llama-7B-2_Llama2-7B-Flash_868_full_model/blob/main/results_2023-10-29T09-16-00.873424.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval config):
```json
{
"all": {
"em": 0.29247063758389263,
"em_stderr": 0.004658574242541351,
"f1": 0.34158871644295397,
"f1_stderr": 0.004610159225684241,
"acc": 0.40783419789572956,
"acc_stderr": 0.009578253696730769
},
"harness|drop|3": {
"em": 0.29247063758389263,
"em_stderr": 0.004658574242541351,
"f1": 0.34158871644295397,
"f1_stderr": 0.004610159225684241
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
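Since both GSM8K and Winogrande report `acc`, the aggregated `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies. A minimal sanity check, with the values copied from the results above:

```python
# Per-task accuracies from the latest results above
per_task_acc = {
    "harness|gsm8k|5": 0.06823351023502654,
    "harness|winogrande|5": 0.7474348855564326,
}

# Unweighted mean across the tasks that report `acc`
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # matches the aggregated "acc" of ~0.4078 in the "all" block
```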
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
arthurmluz/GPTextSum2_data-wiki_cstnews_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 92922
num_examples: 20
download_size: 89357
dataset_size: 92922
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum2_data-wiki_cstnews_results"
- rouge = {'rouge1': 0.40559145209215386, 'rouge2': 0.1858323707445477, 'rougeL': 0.2713738809702273, 'rougeLsum': 0.2713738809702273}
- bert = {'precision': 0.7676798492670059, 'recall': 0.7191876947879792, 'f1': 0.7423095703125}
- moverScore = 0.6047207310084797
sam1120/parking-utcustom-eval | ---
dataset_info:
features:
- name: name
dtype: string
- name: pixel_values
dtype: image
- name: labels
dtype: image
splits:
- name: train
num_bytes: 79902058.0
num_examples: 29
download_size: 22213204
dataset_size: 79902058.0
---
# Dataset Card for "parking-utcustom-eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/stack-v2-python | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: directory_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_language
dtype: string
- name: files
list:
- name: blob_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: language
dtype: string
- name: length_bytes
dtype: int64
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: src_encoding
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: alphanum_fraction
dtype: float32
- name: alpha_fraction
dtype: float32
- name: num_lines
dtype: int32
- name: avg_line_length
dtype: float32
- name: max_line_length
dtype: int32
- name: num_files
dtype: int64
splits:
- name: train
num_bytes: 20887324838.790043
num_examples: 8954903
download_size: 15102959847
dataset_size: 20887324838.790043
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/hanako_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hanako/浦和ハナコ/花子 (Blue Archive)
This is the dataset of hanako/浦和ハナコ/花子 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, pink_hair, halo, ahoge, breasts, green_eyes, pink_halo, braid, bow, large_breasts, hair_bow, white_bow, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hanako_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 868.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanako_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1352 | 1.76 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hanako_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hanako_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, cleavage, holding_hose, long_sleeves, looking_at_viewer, official_alternate_costume, solo, white_shirt, bikini_under_clothes, blush, smile, collarbone, collared_shirt, wet_shirt, closed_mouth, see-through, water, white_background, simple_background, thighs, pink_bikini, sitting |
| 1 | 22 |  |  |  |  |  | 1girl, bikini_under_clothes, blue_sky, cleavage, holding_hose, long_sleeves, looking_at_viewer, official_alternate_costume, outdoors, smile, solo, white_shirt, blush, day, water, cloud, pink_bikini, closed_mouth, collarbone, collared_shirt, wet_shirt, see-through, thighs |
| 2 | 6 |  |  |  |  |  | 1girl, bikini_top_only, blush, bottomless, cleavage, collarbone, holding_hose, long_sleeves, looking_at_viewer, navel, official_alternate_costume, open_shirt, smile, solo, stomach, thighs, white_shirt, closed_mouth, collared_shirt, pink_bikini, simple_background, wet, water, white_background, huge_breasts, underboob |
| 3 | 36 |  |  |  |  |  | short_sleeves, 1girl, solo, blush, smile, white_skirt, looking_at_viewer, pleated_skirt, white_shirt, blue_sailor_collar, single_braid, white_serafuku, pink_bow, closed_mouth, simple_background, white_background, open_mouth |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, sweat, completely_nude, looking_at_viewer, open_mouth, penis, single_braid, vaginal, collarbone, heavy_breathing, huge_breasts, navel, pussy, smile, thighs, motion_lines, side_braid, stomach, bar_censor, blurry_background, cowgirl_position, girl_on_top, happy_sex, heart, indoors, mosaic_censoring, pov_crotch, speech_bubble, spread_legs |
| 5 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blue_one-piece_swimsuit, blush, competition_school_swimsuit, double-parted_bangs, highleg_swimsuit, looking_at_viewer, side_braid, single_braid, smile, straight_hair, thighs, closed_mouth, from_side, groin, sideboob, solo, black_one-piece_swimsuit, cowboy_shot, standing, arms_behind_back, ass, blue_sky, blurry_background, bright_pupils, chain-link_fence, covered_navel, day, depth_of_field, gradient_hair, outdoors, pool, taut_clothes, wet_swimsuit, white_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | holding_hose | long_sleeves | looking_at_viewer | official_alternate_costume | solo | white_shirt | bikini_under_clothes | blush | smile | collarbone | collared_shirt | wet_shirt | closed_mouth | see-through | water | white_background | simple_background | thighs | pink_bikini | sitting | blue_sky | outdoors | day | cloud | bikini_top_only | bottomless | navel | open_shirt | stomach | wet | huge_breasts | underboob | short_sleeves | white_skirt | pleated_skirt | blue_sailor_collar | single_braid | white_serafuku | pink_bow | open_mouth | 1boy | hetero | nipples | solo_focus | sweat | completely_nude | penis | vaginal | heavy_breathing | pussy | motion_lines | side_braid | bar_censor | blurry_background | cowgirl_position | girl_on_top | happy_sex | heart | indoors | mosaic_censoring | pov_crotch | speech_bubble | spread_legs | bare_shoulders | blue_one-piece_swimsuit | competition_school_swimsuit | double-parted_bangs | highleg_swimsuit | straight_hair | from_side | groin | sideboob | black_one-piece_swimsuit | cowboy_shot | standing | arms_behind_back | ass | bright_pupils | chain-link_fence | covered_navel | depth_of_field | gradient_hair | pool | taut_clothes | wet_swimsuit | white_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------------|:---------------|:--------------------|:-----------------------------|:-------|:--------------|:-----------------------|:--------|:--------|:-------------|:-----------------|:------------|:---------------|:--------------|:--------|:-------------------|:--------------------|:---------|:--------------|:----------|:-----------|:-----------|:------|:--------|:------------------|:-------------|:--------|:-------------|:----------|:------|:---------------|:------------|:----------------|:--------------|:----------------|:---------------------|:---------------|:-----------------|:-----------|:-------------|:-------|:---------|:----------|:-------------|:--------|:------------------|:--------|:----------|:------------------|:--------|:---------------|:-------------|:-------------|:--------------------|:-------------------|:--------------|:------------|:--------|:----------|:-------------------|:-------------|:----------------|:--------------|:-----------------|:--------------------------|:------------------------------|:----------------------|:-------------------|:----------------|:------------|:--------|:-----------|:---------------------------|:--------------|:-----------|:-------------------|:------|:----------------|:-------------------|:----------------|:-----------------|:----------------|:-------|:---------------|:---------------|:---------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | X | | X | | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 36 |  |  |  |  |  | X | | | | X | | X | X | | X | X | | | | X | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | X | | | | | X | X | X | | | | | | | | X | | | | | | | | | X | | X | | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | X | | | X | X | | | | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
AlekseyKorshuk/dalio-book-handwritten-io-sorted | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 674482.0
num_examples: 442
- name: validation
num_bytes: 519665
num_examples: 315
- name: test
num_bytes: 14786
num_examples: 10
download_size: 0
dataset_size: 1208933.0
---
# Dataset Card for "dalio-book-handwritten-io-sorted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mainlp/inconsistencies_companies | ---
license: cc-by-4.0
--- |
Mr-User/many | ---
license: apache-2.0
---
|
echarlaix/vqa | ---
license: apache-2.0
---
|
satanicsmores/IMAGE2IMAGE-SatTANIC-SMores | ---
license: mit
language:
- en
- gl
- is
tags:
- art
- code
- not-for-all-audiences
pretty_name: pretty_name
size_categories:
- n<1K
--- |
amansahanigermany/cartoonizer-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 6497455.0
num_examples: 10
download_size: 6500440
dataset_size: 6497455.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ttxy/kaggle | ---
license: apache-2.0
---
kaggle datasets |
Megnis/ml_talents_hr-llamma2-style | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 7708951
num_examples: 656
download_size: 2288316
dataset_size: 7708951
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NeuraXenetica/managpt-4080-nlp-prompts-and-generated-texts | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
pretty_name: 'ManaGPT: 4,080 NLP prompts and generated texts'
size_categories:
- 1K<n<10K
---
This dataset includes 4,080 texts that were generated by the [**ManaGPT-1020**](https://huggingface.co/NeuraXenetica/ManaGPT-1020) large language model, in response to particular input sequences.
ManaGPT-1020 is a free, open-source model available for download and use via Hugging Face’s “transformers” Python package. The model is a 1.5-billion-parameter LLM that’s capable of generating text in order to complete a sentence whose first words have been provided via a user-supplied input sequence. The model represents an elaboration of GPT-2 that has been fine-tuned (using Python and TensorFlow) on a specialized English-language corpus of over 509,000 words from the domain of organizational futures studies. In particular, the model has been trained to generate analysis, predictions, and recommendations regarding the emerging role of advanced AI, social robotics, ubiquitous computing, virtual reality, neurocybernetic augmentation, and other “posthumanizing” technologies in organizational life.
In generating the texts, 204 different prompts were used, each of which was employed to generate 20 responses. The 204 input sequences were created by concatenating 12 different "subjects" with 17 different "modal variants," in every possible combination. The subjects included 6 grammatically singular subjects:
- "The workplace of tomorrow"
- "Technological posthumanization"
- "The organizational use of AI"
- "A robotic boss"
- "An artificially intelligent coworker"
- "Business culture within Society 5.0"
Also included were 6 grammatically plural subjects:
- "Social robots"
- "Hybrid human-robotic organizations"
- "Artificially intelligent businesses"
- "The posthumanized workplaces of the future"
- "Cybernetically augmented workers"
- "Organizations in Society 5.0"
For the 6 grammatically singular subjects, the 17 modal variants included one "blank" variant (an empty string) and 16 phrases that lend the input sequence diverse forms of "modal shading," by indicating varying degrees of certainty, probability, predictability, logical necessity, or moral obligation or approbation. These modal variants were:
- ""
- " is"
- " is not"
- " will"
- " will be"
- " may"
- " might never"
- " is likely to"
- " is unlikely to"
- " should"
- " can"
- " cannot"
- " can never"
- " must"
- " must not"
- " is like"
- " will be like"
The variants used with grammatically plural subjects were identical, apart from the fact that the word “is” was changed to “are,” wherever it appeared.
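The prompt grid described above can be reconstructed by crossing the 12 subjects with the 17 modal variants (a minimal sketch; note that the cross-product yields 12 × 17 = 204 input sequences, and the pluralization rule simply replaces " is" with " are"):

```python
# Reconstruct the prompt grid: 12 subjects x 17 modal variants.
singular_subjects = [
    "The workplace of tomorrow",
    "Technological posthumanization",
    "The organizational use of AI",
    "A robotic boss",
    "An artificially intelligent coworker",
    "Business culture within Society 5.0",
]
plural_subjects = [
    "Social robots",
    "Hybrid human-robotic organizations",
    "Artificially intelligent businesses",
    "The posthumanized workplaces of the future",
    "Cybernetically augmented workers",
    "Organizations in Society 5.0",
]
modal_variants = [
    "", " is", " is not", " will", " will be", " may", " might never",
    " is likely to", " is unlikely to", " should", " can", " cannot",
    " can never", " must", " must not", " is like", " will be like",
]

def pluralize(variant: str) -> str:
    # For plural subjects, "is" becomes "are" wherever it appears.
    return variant.replace(" is", " are")

prompts = [s + v for s in singular_subjects for v in modal_variants]
prompts += [s + pluralize(v) for s in plural_subjects for v in modal_variants]
```

Each of these input sequences was then fed to the model 20 times, yielding the 4,080 generated texts.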
In a small number of cases (only occurring when the empty string "" was used as part of the input sequence), the model failed to generate any output beyond the input sequence itself. |
kz919/open-orca-flan-50k-synthetic-reward-e5-mistral-7b-instruct-v7 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: task
dtype: string
- name: ignos-Mistral-T5-7B-v1
dtype: string
- name: cognAI-lil-c3po
dtype: string
- name: viethq188-Rabbit-7B-DPO-Chat
dtype: string
- name: cookinai-DonutLM-v1
dtype: string
- name: v1olet-v1olet-merged-dpo-7B
dtype: string
- name: normalized_rewards
sequence: float64
- name: router_label
dtype: int64
splits:
- name: train
num_bytes: 16166616
num_examples: 7271
download_size: 7453727
dataset_size: 16166616
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cakiki/stack-smol-xxl-embeddings | ---
dataset_info:
features:
- name: token_ids
sequence: int64
- name: lri_160
sequence: float64
splits:
- name: train
num_bytes: 231978165104
num_examples: 11658586
download_size: 34909750705
dataset_size: 231978165104
---
# Dataset Card for "stack-smol-xxl-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-emotion-2d469b4f-13675887 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
RustamovPY/test_voices | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: voice
dtype: string
- name: text
dtype: string
- name: speaker
dtype: string
splits:
- name: train
num_bytes: 385
num_examples: 3
download_size: 2746
dataset_size: 385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_voices"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/dioscuri_pollux_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Dioscuri Pollux (Fate/Grand Order)
This is the dataset of Dioscuri Pollux (Fate/Grand Order), containing 131 images and their tags.
The core tags of this character are `blonde_hair, bangs, breasts, medium_hair, blue_eyes, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 131 | 152.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 131 | 98.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 311 | 202.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 131 | 136.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 311 | 266.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dioscuri_pollux_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dioscuri_pollux_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, armlet, bare_shoulders, diadem, looking_at_viewer, metal_collar, pauldrons, solo, white_robe, halterneck, thighs, black_shirt, bracer, sword, closed_mouth, faulds, simple_background |
| 1 | 6 |  |  |  |  |  | 1girl, armlet, bare_shoulders, blush, bracer, covered_navel, diadem, halterneck, looking_at_viewer, medium_breasts, simple_background, solo, thighs, white_background, metal_collar, purple_eyes, smile, closed_mouth, white_robe, faulds |
| 2 | 6 |  |  |  |  |  | 1girl, armlet, black_shirt, diadem, looking_at_viewer, metal_collar, short_hair, twins, white_robe, 1boy, bare_shoulders, brother_and_sister, simple_background, white_background, pauldrons, halterneck, smile |
| 3 | 17 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, thighs, diadem, large_breasts, open_mouth, collarbone, penis, armlet, bar_censor, pussy, vaginal, bare_shoulders, girl_on_top, nude, purple_eyes, sex_from_behind, smile, speech_bubble, straddling |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armlet | bare_shoulders | diadem | looking_at_viewer | metal_collar | pauldrons | solo | white_robe | halterneck | thighs | black_shirt | bracer | sword | closed_mouth | faulds | simple_background | blush | covered_navel | medium_breasts | white_background | purple_eyes | smile | short_hair | twins | 1boy | brother_and_sister | hetero | nipples | large_breasts | open_mouth | collarbone | penis | bar_censor | pussy | vaginal | girl_on_top | nude | sex_from_behind | speech_bubble | straddling |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-----------------|:---------|:--------------------|:---------------|:------------|:-------|:-------------|:-------------|:---------|:--------------|:---------|:--------|:---------------|:---------|:--------------------|:--------|:----------------|:-----------------|:-------------------|:--------------|:--------|:-------------|:--------|:-------|:---------------------|:---------|:----------|:----------------|:-------------|:-------------|:--------|:-------------|:--------|:----------|:--------------|:-------|:------------------|:----------------|:-------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 17 |  |  |  |  |  | X | X | X | X | | | | | | | X | | | | | | | X | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
HydraLM/GPTeacher-General-Instruct_list_dict | ---
dataset_info:
features:
- name: conversations
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 56447404
num_examples: 89259
download_size: 0
dataset_size: 56447404
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher-General-Instruct_list_dict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/cot-collection_v2 | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: rationale
dtype: string
- name: task
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 1427400995.4030628
num_examples: 1271010
download_size: 546101259
dataset_size: 1427400995.4030628
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AntoineBlanot/mnli-3way | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label_name
dtype: string
splits:
- name: train
num_bytes: 75405059
num_examples: 392702
- name: test
num_bytes: 1853683
num_examples: 9815
download_size: 51216284
dataset_size: 77258742
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "mnli-3way"
This dataset is the [multi_nli](https://huggingface.co/datasets/multi_nli) dataset where the labels are: `entailment`, `contradiction` and `neutral`.
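A minimal sketch of the mapping presumably used to derive these string labels from multi_nli's integer `label` field (0 = entailment, 1 = neutral, 2 = contradiction, the standard MNLI convention):

```python
# Standard multi_nli integer-label convention mapped to string names.
LABEL_NAMES = {0: "entailment", 1: "neutral", 2: "contradiction"}

def to_label_name(label: int) -> str:
    """Convert a multi_nli integer label to its string name."""
    return LABEL_NAMES[label]
```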
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/tex_inv_hot_ip_light | ---
dataset_info:
features:
- name: label
dtype: string
- name: tex_inv_hot_ip_prompt_similarity
dtype: float32
- name: tex_inv_hot_ip_identity_consistency
dtype: float32
- name: tex_inv_hot_ip_negative_prompt_similarity
dtype: float32
- name: tex_inv_hot_ip_target_prompt_similarity
dtype: float32
- name: tex_inv_hot_ip_aesthetic_score
dtype: float32
splits:
- name: train
num_bytes: 308
num_examples: 11
download_size: 4221
dataset_size: 308
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amttl | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- zh
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- parsing
pretty_name: AMTTL
dataset_info:
config_name: amttl
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: tags
sequence:
class_label:
names:
'0': B
'1': I
'2': E
'3': S
splits:
- name: train
num_bytes: 1132196
num_examples: 3063
- name: validation
num_bytes: 324358
num_examples: 822
- name: test
num_bytes: 328509
num_examples: 908
download_size: 274351
dataset_size: 1785063
configs:
- config_name: amttl
data_files:
- split: train
path: amttl/train-*
- split: validation
path: amttl/validation-*
- split: test
path: amttl/test-*
default: true
---
# Dataset Card for AMTTL
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/adapt-sjtu/AMTTL/tree/master/medical_data)
- **Repository:** [Github](https://github.com/adapt-sjtu/AMTTL/tree/master/medical_data)
- **Paper:** [Aclweb](http://aclweb.org/anthology/C18-1307)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
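The YAML header above defines the `tags` feature with the labels `B`, `I`, `E`, `S` — a BIES scheme commonly used for Chinese word segmentation (begin / inside / end / single). A minimal sketch, assuming that convention, of decoding tags back into words:

```python
def decode_bies(tokens, tags):
    """Group per-character tokens into words using BIES tags."""
    words, current = [], ""
    for token, tag in zip(tokens, tags):
        current += token
        if tag in ("E", "S"):   # a word ends on E (end) or S (single)
            words.append(current)
            current = ""
    if current:                 # flush a dangling partial word, if any
        words.append(current)
    return words
```

For example, `decode_bies(list("医生建议"), ["B", "E", "B", "E"])` groups the four characters into the two words `["医生", "建议"]`.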
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@inproceedings{xing2018adaptive,
title={Adaptive multi-task transfer learning for Chinese word segmentation in medical text},
author={Xing, Junjie and Zhu, Kenny and Zhang, Shaodian},
booktitle={Proceedings of the 27th International Conference on Computational Linguistics},
pages={3619--3630},
year={2018}
}
```
### Contributions
Thanks to [@JetRunner](https://github.com/JetRunner) for adding this dataset. |
leoozyhg/12312123 | ---
license: openrail
---
|
GAIR/preference-dissection | ---
dataset_info:
features:
- name: query
dtype: string
- name: scenario_auto-j
dtype: string
- name: scenario_group
dtype: string
- name: response_1
struct:
- name: content
dtype: string
- name: model
dtype: string
- name: num_words
dtype: int64
- name: response_2
struct:
- name: content
dtype: string
- name: model
dtype: string
- name: num_words
dtype: int64
- name: gpt-4-turbo_reference
dtype: string
- name: clear intent
dtype: string
- name: explicitly express feelings
dtype: string
- name: explicit constraints
sequence: string
- name: explicit subjective stances
sequence: string
- name: explicit mistakes or biases
sequence: string
- name: preference_labels
struct:
- name: gpt-3.5-turbo-1106
dtype: string
- name: gpt-4-1106-preview
dtype: string
- name: human
dtype: string
- name: llama-2-13b
dtype: string
- name: llama-2-13b-chat
dtype: string
- name: llama-2-70b
dtype: string
- name: llama-2-70b-chat
dtype: string
- name: llama-2-7b
dtype: string
- name: llama-2-7b-chat
dtype: string
- name: mistral-7b
dtype: string
- name: mistral-7b-instruct-v0.1
dtype: string
- name: mistral-7b-instruct-v0.2
dtype: string
- name: mistral-8x7b
dtype: string
- name: mistral-8x7b-instruct-v0.1
dtype: string
- name: qwen-14b
dtype: string
- name: qwen-14b-chat
dtype: string
- name: qwen-72b
dtype: string
- name: qwen-72b-chat
dtype: string
- name: qwen-7b
dtype: string
- name: qwen-7b-chat
dtype: string
- name: tulu-2-dpo-13b
dtype: string
- name: tulu-2-dpo-70b
dtype: string
- name: tulu-2-dpo-7b
dtype: string
- name: vicuna-13b-v1.5
dtype: string
- name: vicuna-7b-v1.5
dtype: string
- name: wizardLM-13b-v1.2
dtype: string
- name: wizardLM-70b-v1.0
dtype: string
- name: yi-34b
dtype: string
- name: yi-34b-chat
dtype: string
- name: yi-6b
dtype: string
- name: yi-6b-chat
dtype: string
- name: zephyr-7b-alpha
dtype: string
- name: zephyr-7b-beta
dtype: string
- name: basic_response_1
struct:
- name: admit limitations or mistakes
dtype: int64
- name: authoritative tone
dtype: int64
- name: clear and understandable
dtype: int64
- name: complex word usage and sentence structure
dtype: int64
- name: friendly
dtype: int64
- name: funny and humorous
dtype: int64
- name: grammar, spelling, punctuation, and code-switching
dtype: int64
- name: harmlessness
dtype: int64
- name: information richness without considering inaccuracy
dtype: int64
- name: innovative and novel
dtype: int64
- name: interactive
dtype: int64
- name: metaphors, personification, similes, hyperboles, irony, parallelism
dtype: int64
- name: persuade user
dtype: int64
- name: polite
dtype: int64
- name: relevance without considering inaccuracy
dtype: int64
- name: repetitive
dtype: int64
- name: step by step solution
dtype: int64
- name: use of direct and explicit supporting materials
dtype: int64
- name: use of informal expressions
dtype: int64
- name: well formatted
dtype: int64
- name: basic_response_2
struct:
- name: admit limitations or mistakes
dtype: int64
- name: authoritative tone
dtype: int64
- name: clear and understandable
dtype: int64
- name: complex word usage and sentence structure
dtype: int64
- name: friendly
dtype: int64
- name: funny and humorous
dtype: int64
- name: grammar, spelling, punctuation, and code-switching
dtype: int64
- name: harmlessness
dtype: int64
- name: information richness without considering inaccuracy
dtype: int64
- name: innovative and novel
dtype: int64
- name: interactive
dtype: int64
- name: metaphors, personification, similes, hyperboles, irony, parallelism
dtype: int64
- name: persuade user
dtype: int64
- name: polite
dtype: int64
- name: relevance without considering inaccuracy
dtype: int64
- name: repetitive
dtype: int64
- name: step by step solution
dtype: int64
- name: use of direct and explicit supporting materials
dtype: int64
- name: use of informal expressions
dtype: int64
- name: well formatted
dtype: int64
- name: errors_response_1
struct:
- name: applicable or not
dtype: string
- name: errors
list:
- name: brief description
dtype: string
- name: severity
dtype: string
- name: type
dtype: string
- name: errors_response_2
struct:
- name: applicable or not
dtype: string
- name: errors
list:
- name: brief description
dtype: string
- name: severity
dtype: string
- name: type
dtype: string
- name: query-specific_response_1
struct:
- name: clarify user intent
dtype: int64
- name: correcting explicit mistakes or biases
sequence: string
- name: satisfying explicit constraints
sequence: string
- name: showing empathetic
dtype: int64
- name: supporting explicit subjective stances
sequence: string
- name: query-specific_response_2
struct:
- name: clarify user intent
dtype: int64
- name: correcting explicit mistakes or biases
sequence: string
- name: satisfying explicit constraints
sequence: string
- name: showing empathetic
dtype: int64
- name: supporting explicit subjective stances
sequence: string
splits:
- name: train
num_bytes: 27617371
num_examples: 5240
download_size: 13124269
dataset_size: 27617371
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
pretty_name: Preference Dissection
license: cc-by-nc-4.0
---
## Introduction
We release the annotated data used in [Dissecting Human and LLM Preferences](https://arxiv.org/abs/2402.11296).
*Original Dataset* - The dataset is based on [lmsys/chatbot_arena_conversations](https://huggingface.co/datasets/lmsys/chatbot_arena_conversations), which contains 33K cleaned conversations with pairwise human preferences collected from 13K unique IP addresses on the [Chatbot Arena](https://lmsys.org/blog/2023-05-03-arena/) from April to June 2023.
*Filtering and Scenario-wise Sampling* - We filter out conversations that are not in English, that carry "Tie" or "Both Bad" labels, and that span multiple turns. We first select 400 samples with unsafe queries according to the OpenAI moderation API tags and the additional toxic tags in the original dataset; we then apply [Auto-J's scenario classifier](https://huggingface.co/GAIR/autoj-scenario-classifier) to determine the scenario of each sample (merging Auto-J's scenarios into 10 new ones). For the *Knowledge-aware* and *Others* scenarios we pick 820 samples each, and for the other scenarios we pick 400 samples each. The total number is 5,240.
*Collecting Preferences* - Besides the human preference labels in the original dataset, we also collect binary preference labels from 32 LLMs, including 2 proprietary LLMs and 30 open-source ones.
*Annotation on Defined Properties* - We define a set of 29 properties and annotate how each property is satisfied (with a Likert-scale rating or property-specific annotation) in all responses ($5,240\times 2=10,480$). See our paper for more details on the defined properties.
## Dataset Overview
An example of the json format is as follows:
```json
{
"query": "...",
"scenario_auto-j": "...",
"scenario_group": "...",
"response_1": {
"content": "...",
"model": "...",
"num_words": "..."
},
"response_2": {...},
"gpt-4-turbo_reference": "...",
"clear intent": "Yes/No",
"explicitly express feelings": "Yes/No",
"explicit constraints": [
...
],
"explicit subjective stances": [
...
],
"explicit mistakes or biases": [
...
],
"preference_labels": {
"human": "response_1/response_2",
"gpt-4-turbo": "response_1/response_2",
...
},
"basic_response_1": {
"admit limitations or mistakes": 0/1/2/3,
"authoritative tone": 0/1/2/3,
...
},
"basic_response_2": {...},
"errors_response_1": {
"applicable or not": "applicable/not applicable",
"errors":[
{
"brief description": "...",
"severity": "severe/moderate/minor",
"type": "...",
},
...
]
},
"errors_response_2": {...},
"query-specific_response_1": {
"clarify user intent": ...,
"correcting explicit mistakes or biases": None,
"satisfying explicit constraints": [
...
],
"showing empathetic": [
...
],
"supporting explicit subjective stances": [
...
]
},
"query-specific_response_2": {...}
}
```
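Given this schema, a minimal sanity check (a sketch; the field names are taken from the example above) that a loaded record carries the expected top-level keys:

```python
# Top-level fields of a preference-dissection record, per the schema above.
EXPECTED_TOP_LEVEL_KEYS = {
    "query", "scenario_auto-j", "scenario_group",
    "response_1", "response_2", "gpt-4-turbo_reference",
    "clear intent", "explicitly express feelings",
    "explicit constraints", "explicit subjective stances",
    "explicit mistakes or biases", "preference_labels",
    "basic_response_1", "basic_response_2",
    "errors_response_1", "errors_response_2",
    "query-specific_response_1", "query-specific_response_2",
}

def missing_keys(record: dict) -> set:
    """Return the expected top-level keys absent from a record."""
    return EXPECTED_TOP_LEVEL_KEYS - record.keys()
```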
The following fields are basic information:
- **query**: The user query.
- **scenario_auto-j**: The scenario classified by Auto-J's classifier.
- **scenario_group**: One of the 10 new scenarios we merged from the Auto-J's scenario, including an *Unsafe Query* scenario.
- **response_1/response_2**: The content of a response:
- **content**: The text content.
  - **model**: The model that generated this response.
  - **num_words**: The number of words in this response, determined by NLTK.
- **gpt-4-turbo_reference**: A reference response generated by GPT-4-Turbo.
The following fields are Query-Specific prerequisites. For the last three, the list may be empty if there are no constraints/stances/mistakes.
- **clear intent**: Whether the intent of the user is clearly expressed in the query, "Yes" or "No".
- **explicitly express feelings**: Whether the user clearly expresses his/her feelings or emotions in the query, "Yes" or "No".
- **explicit constraints**: A list containing all the explicit constraints in the query.
- **explicit subjective stances**: A list containing all the subjective stances in the query.
- **explicit mistakes or biases**: A list containing all the mistakes or biases in the query.
The following fields are the main body of the annotation.
- **preference_labels**: The preference label for each judge (human or an LLM) indicating which response is preferred in a pair, "response_1/response_2".
- **basic_response_1/basic_response_2**: The annotated ratings of the 20 basic properties (except *lengthy*) for the response.
- **property_name**: 0/1/2/3
- ...
- **errors_response_1/errors_response_2**: The detected errors of the response.
  - **applicable or not**: Whether GPT-4-Turbo finds itself able to reliably detect the errors in the response.
- **errors**: A list containing the detected errors in the response.
- **brief description**: A brief description of the error.
    - **severity**: How much the error affects the overall correctness of the response, "severe/moderate/minor".
    - **type**: The type of the error, "factual error/information contradiction to the query/math operation error/code generation error".
- **query-specific_response_1/query-specific_response_2**: The annotation results of the Query-Specific properties.
  - **clarify user intent**: If the user intent is not clear, rate how much the response helps clarify the intent, 0/1/2/3.
  - **showing empathetic**: If the user expresses feelings or emotions, rate how much the response shows empathy, 0/1/2/3.
  - **satisfying explicit constraints**: If there are explicit constraints in the query, rate how much the response satisfies each of them.
- A list of "{description of constraint} | 0/1/2/3"
  - **correcting explicit mistakes or biases**: If there are mistakes or biases in the query, classify how the response corrects each of them.
- A list of "{description of mistake} | Pointed out and corrected/Pointed out but not corrected/Corrected without being pointed out/Neither pointed out nor corrected"
  - **supporting explicit subjective stances**: If there are subjective stances in the query, classify how the response supports each of them.
- A list of "{description of stance} | Strongly supported/Weakly supported/Neutral/Weakly opposed/Strongly opposed"
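The per-sample fields described above can be consumed as plain JSON. A minimal sketch of tallying detected errors by severity; the record below is a hypothetical sample following this schema, and the exact storage layout of the released files is an assumption:

```python
import json

# Hypothetical record following the field schema described above; the exact
# file layout of the released dataset is an assumption.
record = json.loads("""
{
  "preference_labels": {"human": "response_1", "gpt-4-1106-preview": "response_2"},
  "errors_response_1": {
    "applicable or not": true,
    "errors": [
      {"brief description": "wrong release year", "severity": "severe",
       "type": "factual error"},
      {"brief description": "off-by-one in the sum", "severity": "minor",
       "type": "math operation error"}
    ]
  }
}
""")

# Count detected errors in response_1 by severity.
by_severity = {}
for err in record["errors_response_1"]["errors"]:
    by_severity[err["severity"]] = by_severity.get(err["severity"], 0) + 1
print(by_severity)  # {'severe': 1, 'minor': 1}
```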
## Statistics
👇 Number of samples meeting 5 Query-specific prerequisites.
| Prerequisite | # | Prerequisite | # |
| ------------------------- | ----- | ---------------- | ---- |
| with explicit constraints | 1,418 | unclear intent | 459 |
| show subjective stances | 388 | express feelings | 121 |
| contain mistakes or bias | 401 | | |
👇 Mean Score/Count for each property in collected data. *The average scores of 5 query-specific properties are calculated only on samples where the queries met specific prerequisites.
| Property | Mean Score/Count | Property | Mean Score/Count |
| ---------------------------- | ---------------- | ---------------------------- | ---------------- |
| **Mean Score** | | | |
| harmless | 2.90 | persuasive | 0.27 |
| grammarly correct | 2.70 | step-by-step | 0.37 |
| friendly | 1.79 | use informal expressions | 0.04 |
| polite | 2.78 | clear | 2.54 |
| interactive | 0.22 | contain rich information | 1.74 |
| authoritative | 1.67 | novel | 0.47 |
| funny | 0.08 | relevant | 2.45 |
| use rhetorical devices | 0.16 | clarify intent* | 1.33 |
| complex word & sentence | 0.89 | show empathetic* | 1.48 |
| use supporting materials | 0.13 | satisfy constraints* | 2.01 |
| well formatted | 1.26 | support stances* | 2.28 |
| admit limits | 0.17 | correct mistakes* | 1.08 |
| **Mean Count** | | | |
| severe errors | 0.59 | minor errors | 0.23 |
| moderate errors | 0.61 | length | 164.52 |
👇 Property correlation in the annotated data.
<img src="./property_corr.PNG" alt="image-20240213145030747" style="zoom: 50%;" />
## Disclaimers and Terms
*This part is copied from the original dataset.*
- **This dataset contains conversations that may be considered unsafe, offensive, or upsetting.** It is not intended for training dialogue agents without applying appropriate filtering measures. We are not responsible for any outputs of the models trained on this dataset.
- Statements or opinions made in this dataset do not reflect the views of researchers or institutions involved in the data collection effort.
- Users of this data are responsible for ensuring its appropriate use, which includes abiding by any applicable laws and regulations.
- Users of this data should adhere to the terms of use for a specific model when using its direct outputs.
- Users of this data agree to not attempt to determine the identity of individuals in this dataset.
## License
Following the original dataset, this dataset is licensed under CC-BY-NC-4.0.
## Citation
```
@article{li2024dissecting,
title={Dissecting Human and LLM Preferences},
author={Li, Junlong and Zhou, Fan and Sun, Shichao and Zhang, Yikai and Zhao, Hai and Liu, Pengfei},
journal={arXiv preprint arXiv:2402.11296},
year={2024}
}
``` |
the-french-artist/wikipedia_20220301.simple_sentence_embeddings | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: sentence_index
dtype: int64
- name: line_index
dtype: int64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 1061056337
num_examples: 585455
download_size: 1331697726
dataset_size: 1061056337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
YUiCHl/scale0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 474801995.0
num_examples: 1588
download_size: 472199271
dataset_size: 474801995.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hopee4/cariuchas | ---
license: openrail
---
|
gagan3012/arabictext | ---
dataset_info:
features:
- name: text
dtype: string
- name: check_char_repetition_criteria
dtype: float64
- name: check_flagged_words_criteria
dtype: float64
- name: meta_data
dtype: string
- name: __id__
dtype: int64
- name: duplicate
dtype: bool
- name: char
dtype: int64
- name: Arabic_char
dtype: int64
- name: latin_char
dtype: int64
- name: numbers_char
dtype: int64
- name: puc_char
dtype: int64
splits:
- name: train
num_bytes: 471530
num_examples: 1000
download_size: 231211
dataset_size: 471530
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "arabictext"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cyzgab/singlish-to-english-synthetic | ---
license: cc-by-nc-sa-4.0
task_categories:
- translation
language:
- en
pretty_name: Singlish to English 🇸🇬
size_categories:
- n<1K
---
# Singlish to English 🇸🇬
> Singapore is known for its efficiency and Singlish is no different - it's colourful and snappy. - [Tessa Wong, BBC News, 2015](https://www.bbc.com/news/magazine-33809914)
This is a synthetic dataset generated by GPT-4.
Each json pair contains one Singlish sentence about an everyday activity (e.g. cooking) and its English translation.
# Sample entry
```json
{
  "singlish": "Eh, chop the garlic - you can a not?",
  "english": "Hey, do you know how to chop the garlic?"
}
```
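The released CSV flattens the generated JSON (a running index as key, `s`/`e` as fields) into `index, singlish, english` columns, as the generation code below shows with pandas. A stdlib-only sketch of the same conversion on a tiny inline sample:

```python
import csv
import io
import json

# Tiny inline sample mirroring the structure GPT-4 returns
# (keys are running indices, "s" is Singlish, "e" is English).
raw = json.loads('{"1": {"s": "Eh, chop the garlic - you can a not?",'
                 ' "e": "Hey, do you know how to chop the garlic?"}}')

# Flatten the nested dict into tabular CSV rows.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["index", "singlish", "english"])
for key, pair in raw.items():
    writer.writerow([key, pair["s"], pair["e"]])
print(buf.getvalue())
```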
# Data Generation Code
```python
import json
import pandas as pd
from openai import OpenAI
client = OpenAI()
NUM_SAMPLE = 10
ACTIVITIES = ['cooking',
'studying',
'sleeping',
'eating',
'working',
'exercising',
'reading',
'cleaning',
'shopping',
'driving',
'walking',
'bathing',
'going to work',
'listening to music',
'watching TV',
'playing video games',
'using a computer',
'texting',
'socializing',
'meditating',
'commuting',
'doing laundry',
'ironing clothes',
'dusting',
'vacuuming',
'painting',
'drawing',
'grocery shopping',
'sewing',
'taking a nap',
'jogging',
'biking',
'swimming',
'playing sports',
'checking emails',
'playing with children',
'watching movies',
'playing board games',
'attending school or classes',
'going to the gym',
'playing a musical instrument',
'singing',
'dancing',
'writing',
'photography',
'traveling',
'visiting friends',
'attending events',
'volunteering',
'attending meetings']
dataset = {}
for index, activity in enumerate(ACTIVITIES):
print(index, activity)
response = client.chat.completions.create(
model="gpt-4-1106-preview",
messages=[{"role": "system",
"content": "You are an expert in translating Singlish to English"},
{"role": "user",
"content": f"Create {NUM_SAMPLE} random Singlish (s) to English (e) translation pairs in json. Write full sentences about {activity}."\
f"Don't exaggerate the use of Singlish, and be natural, as how a real Singaporean would speak."\
f"Start the keys from {(index*NUM_SAMPLE)+1}. For example,"\
"{'X':{'s': 'aiyo, why like that', 'e': 'oh my, how did this happen'}"\
"..., 'X+5': {'s': 'don't play play', 'e': 'don't fool around'} }"}],
temperature=0.01,
response_format={"type":"json_object"}
)
output = response.choices[0].message.content
output_json = json.loads(output)
dataset.update(output_json)
# Save the current state of the combined dictionary
with open('singlish_to_english_v0.1.json', 'w') as f:
json.dump(dataset, f, indent=None)
# Convert to tabular csv
df = pd.read_json("singlish_to_english_v0.1.json")
df = df.T
df = df.reset_index()
df.columns = ["index", "singlish", "english"]
df.to_csv("singlish_to_english_v0.1.csv", index=False)
``` |
open-llm-leaderboard/details_Replete-AI__Mistral-Evolved-11b-v0.1 | ---
pretty_name: Evaluation run of Replete-AI/Mistral-Evolved-11b-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Replete-AI/Mistral-Evolved-11b-v0.1](https://huggingface.co/Replete-AI/Mistral-Evolved-11b-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Replete-AI__Mistral-Evolved-11b-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T15:15:50.568647](https://huggingface.co/datasets/open-llm-leaderboard/details_Replete-AI__Mistral-Evolved-11b-v0.1/blob/main/results_2024-03-21T15-15-50.568647.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6309829529704917,\n\
\ \"acc_stderr\": 0.03249502226073932,\n \"acc_norm\": 0.6345615860197364,\n\
\ \"acc_norm_stderr\": 0.033137530512338635,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5923114451952954,\n\
\ \"mc2_stderr\": 0.016045963776594944\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719867,\n\
\ \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.014169664520303101\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6646086436964748,\n\
\ \"acc_stderr\": 0.004711622011148463,\n \"acc_norm\": 0.8465445130452102,\n\
\ \"acc_norm_stderr\": 0.0035968938961909113\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398176,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398176\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298901,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298901\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437395,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437395\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159274,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159274\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077816,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077816\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296417,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296417\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20446927374301677,\n\
\ \"acc_stderr\": 0.01348881340471193,\n \"acc_norm\": 0.20446927374301677,\n\
\ \"acc_norm_stderr\": 0.01348881340471193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694905,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694905\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.01945076843250552,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.01945076843250552\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078684,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078684\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.5923114451952954,\n\
\ \"mc2_stderr\": 0.016045963776594944\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4981046247156937,\n \
\ \"acc_stderr\": 0.013772385765569753\n }\n}\n```"
repo_url: https://huggingface.co/Replete-AI/Mistral-Evolved-11b-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|arc:challenge|25_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|gsm8k|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hellaswag|10_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-15-50.568647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T15-15-50.568647.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- '**/details_harness|winogrande|5_2024-03-21T15-15-50.568647.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T15-15-50.568647.parquet'
- config_name: results
data_files:
- split: 2024_03_21T15_15_50.568647
path:
- results_2024-03-21T15-15-50.568647.parquet
- split: latest
path:
- results_2024-03-21T15-15-50.568647.parquet
---
# Dataset Card for Evaluation run of Replete-AI/Mistral-Evolved-11b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Replete-AI/Mistral-Evolved-11b-v0.1](https://huggingface.co/Replete-AI/Mistral-Evolved-11b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Replete-AI__Mistral-Evolved-11b-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-21T15:15:50.568647](https://huggingface.co/datasets/open-llm-leaderboard/details_Replete-AI__Mistral-Evolved-11b-v0.1/blob/main/results_2024-03-21T15-15-50.568647.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6309829529704917,
"acc_stderr": 0.03249502226073932,
"acc_norm": 0.6345615860197364,
"acc_norm_stderr": 0.033137530512338635,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.5923114451952954,
"mc2_stderr": 0.016045963776594944
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719867,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.014169664520303101
},
"harness|hellaswag|10": {
"acc": 0.6646086436964748,
"acc_stderr": 0.004711622011148463,
"acc_norm": 0.8465445130452102,
"acc_norm_stderr": 0.0035968938961909113
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398176,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398176
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437395,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437395
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159274,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159274
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077816,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077816
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296417,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296417
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.01348881340471193,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.01348881340471193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694905,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694905
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.01945076843250552,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.01945076843250552
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078684,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078684
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.5923114451952954,
"mc2_stderr": 0.016045963776594944
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174789
},
"harness|gsm8k|5": {
"acc": 0.4981046247156937,
"acc_stderr": 0.013772385765569753
}
}
```
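The per-task scores in the JSON above can be aggregated with a short script. The snippet below is only a sketch: it hard-codes three of the entries as sample input (in practice you would `json.load()` the full results file) and averages the MMLU (`hendrycksTest`) subtasks.

```python
import json

# Excerpt of the results above, used as sample input for illustration;
# the real file contains one entry per harness task.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc": 0.783625730994152},
    "harness|hendrycksTest-virology|5": {"acc": 0.5240963855421686},
    "harness|gsm8k|5": {"acc": 0.4981046247156937},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
print(sum(mmlu) / len(mmlu))
```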
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kavyamanohar/Malayalam-word-freq | ---
license: cc-by-4.0
---
## Word Frequency Profile of Malayalam
The repo contains Malayalam words and their frequencies as obtained from AI4Bharat [Indic NLP corpus](https://github.com/AI4Bharat/indicnlp_corpus).
There is an associated Python script to plot the word frequency profile.
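The core of such a profile is a rank-frequency (Zipf) table. The sketch below uses made-up (word, count) pairs purely for illustration; the real counts come from the frequency list in this repo, and a plotting step (e.g. a log-log plot) would follow.

```python
# Compute a rank-frequency (Zipf) profile from word counts.
# The (word, count) pairs here are toy values, not real corpus data.
counts = {"ഒരു": 120, "എന്ന്": 95, "അവൻ": 40, "പുസ്തകം": 12}

# Sort by descending frequency and pair each word with its rank.
profile = [
    (rank, word, freq)
    for rank, (word, freq) in enumerate(
        sorted(counts.items(), key=lambda kv: -kv[1]), start=1
    )
]
print(profile[0])  # the highest-frequency word gets rank 1
```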
|
hemantk089/llama2_7b_fine_tuning_complete_dataset_v7 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 301722
num_examples: 813
- name: test
num_bytes: 72617
num_examples: 204
download_size: 107905
dataset_size: 374339
---
# Dataset Card for "llama2_7b_fine_tuning_complete_dataset_v7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/koshigaya_natsumi_nonnonbiyori | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koshigaya Natsumi
This is the dataset of Koshigaya Natsumi, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 737 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 824 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 737 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 737 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 604 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 824 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 824 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Code-Refinement/utf_20_refs | ---
dataset_info:
features:
- name: problem_id
dtype: string
- name: attempt_id
dtype: int64
- name: generated_solution
dtype: string
- name: original_reward
dtype: float64
- name: chosen_ref_id
dtype: int64
- name: chosen_refinement
dtype: string
- name: chosen_reward
dtype: float64
- name: rejected_ref_id
dtype: int64
- name: rejected_refinement
dtype: string
- name: rejected_reward
dtype: float64
splits:
- name: train
num_bytes: 439039919
num_examples: 303459
- name: test
num_bytes: 116366494
num_examples: 68155
download_size: 17812330
dataset_size: 555406413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
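The schema above pairs each generated solution with a higher-reward (`chosen_refinement`) and a lower-reward (`rejected_refinement`) variant, which maps naturally onto preference-optimization formats. The sketch below builds a DPO-style triple from one row; the row dict is a made-up illustration of the schema, not real data from this dataset.

```python
# Turn one row of the schema into a DPO-style preference pair.
# The values below are invented for illustration only.
row = {
    "problem_id": "p-001",
    "generated_solution": "def f(x): return x",
    "chosen_refinement": "def f(x):\n    return x + 1",
    "chosen_reward": 0.9,
    "rejected_refinement": "def f(x):\n    return x - 1",
    "rejected_reward": 0.1,
}

# DPO expects (prompt, chosen, rejected); the original solution acts as prompt.
pair = {
    "prompt": row["generated_solution"],
    "chosen": row["chosen_refinement"],
    "rejected": row["rejected_refinement"],
}
assert row["chosen_reward"] > row["rejected_reward"]
```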
|