| datasetId | card |
|---|---|
chathuranga-jayanath/context-5-predict-token-for-fine-tune-without-comments-from-maven-doxia-1.0-2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 215985
num_examples: 305
- name: validation
num_bytes: 26311
num_examples: 37
- name: test
num_bytes: 26596
num_examples: 37
download_size: 68312
dataset_size: 268892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
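The split metadata above is internally consistent: the declared `dataset_size` is the sum of the per-split `num_bytes`. A quick stdlib check, with the numbers copied from the YAML:

```python
# Per-split byte and example counts copied from the YAML metadata above.
splits = {
    "train": {"num_bytes": 215985, "num_examples": 305},
    "validation": {"num_bytes": 26311, "num_examples": 37},
    "test": {"num_bytes": 26596, "num_examples": 37},
}

dataset_size = sum(s["num_bytes"] for s in splits.values())
total_examples = sum(s["num_examples"] for s in splits.values())

print(dataset_size)    # 268892, matching the declared dataset_size
print(total_examples)  # 379 examples across all three splits
```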
|
aakanksha19/pico_bigbio_processed | ---
license: unknown
---
|
knaranje/knaranje-dataset-1 | ---
license: unlicense
---
|
zishuod/pokemon-icons | ---
annotations_creators: []
language: []
language_creators: []
license:
- mit
multilinguality: []
pretty_name: pokemon-icons
size_categories: []
source_datasets: []
tags:
- pokemon
task_categories:
- image-classification
task_ids: []
---
# Dataset Card for pokemon-icons
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Pokémon icons, most of them collected and cropped from screenshots captured in Pokémon Sword and Shield.
### Supported Tasks and Leaderboards
Image classification |
derek-thomas/squad-v1.1-t5-question-generation | ---
dataset_info:
features:
- name: context
dtype: string
- name: questions
dtype: string
splits:
- name: train
num_bytes: 20293805
num_examples: 18896
- name: validation
num_bytes: 2376313
num_examples: 2067
download_size: 12600387
dataset_size: 22670118
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- crowdsourced
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Question Generation for T5 based on Squad V1.1
size_categories:
- 10K<n<100K
source_datasets:
- extended|squad
tags:
- questiongeneration
- question-generation
- text2text-generation
task_categories:
- text2text-generation
task_ids: []
---
# Dataset Card for "squad-v1.1-t5-question-generation"
## Dataset Description
- **Homepage:** [https://rajpurkar.github.io/SQuAD-explorer/](https://rajpurkar.github.io/SQuAD-explorer/)
- **Paper:** [SQuAD: 100,000+ Questions for Machine Comprehension of Text](https://arxiv.org/abs/1606.05250)
### Dataset Summary
This is a modified version of the Stanford Question Answering Dataset (SQuAD), adapted for question generation with All Questions in One Line (AQOL), as in [Transformer-based End-to-End Question Generation](https://arxiv.org/pdf/2005.01107v1.pdf), and targeted specifically at the T5 family of models. The prefix is `generate questions: ` so that the task can be unique to a trained model.
Check out the generation notebook [here](https://nbviewer.org/urls/huggingface.co/datasets/derek-thomas/squad-v1.1-t5-question-generation/resolve/main/Squad_V1_Question_Generation.ipynb).
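The packing scheme just described (a `generate questions: ` prefix on the context, and all questions joined on one line by a separator token) can be sketched in a few lines of plain Python. The literal `<sep>` string below is an illustrative stand-in; the actual separator depends on the tokenizer used:

```python
# Pack one context and its questions into the AQOL one-line format.
PREFIX = "generate questions: "

def pack_example(context: str, questions: list[str], sep_token: str = "<sep>") -> dict:
    """Build a single training pair: prefixed context in, all questions out."""
    return {
        "context": PREFIX + context,
        "questions": f" {sep_token} ".join(questions) + f" {sep_token}",
    }

pair = pack_example(
    "This is a test context.",
    ["Is this a test?", "Is this another Test"],
)
print(pair["context"])    # generate questions: This is a test context.
print(pair["questions"])  # Is this a test? <sep> Is this another Test <sep>
```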
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
## Dataset Structure
### Data Instances
#### plain_text
An example of 'train' looks as follows.
```
{
"context": "generate questions: This is a test context.",
"questions": "Is this a test? {sep_token} Is this another Test {sep_token}"
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `context`: a `string` feature.
- `questions`: a `string` feature.
### Data Splits
| name |train|validation|
|----------|----:|---------:|
|plain_text|18896| 2067|
### Citation Information
```
@article{2016arXiv160605250R,
author = {{Rajpurkar}, Pranav and {Zhang}, Jian and {Lopyrev},
Konstantin and {Liang}, Percy},
title = "{SQuAD: 100,000+ Questions for Machine Comprehension of Text}",
journal = {arXiv e-prints},
year = 2016,
eid = {arXiv:1606.05250},
pages = {arXiv:1606.05250},
archivePrefix = {arXiv},
eprint = {1606.05250},
}
```
### Contributions
Thanks to [Derek Thomas](https://huggingface.co/derek-thomas) and [Thomas Simonini](https://huggingface.co/ThomasSimonini) for adding this dataset to the Hub.
Check out: [How to contribute more](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Visitors
[](https://visitorbadge.io/status?path=https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fderek-thomas%2Fsquad-v1.1-t5-question-generation) |
open-llm-leaderboard/details_TheSkullery__Aurora-V2-DLEC | ---
pretty_name: Evaluation run of TheSkullery/Aurora-V2-DLEC
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheSkullery/Aurora-V2-DLEC](https://huggingface.co/TheSkullery/Aurora-V2-DLEC)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheSkullery__Aurora-V2-DLEC\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:52:05.265912](https://huggingface.co/datasets/open-llm-leaderboard/details_TheSkullery__Aurora-V2-DLEC/blob/main/results_2024-03-29T20-52-05.265912.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5205667326482667,\n\
\ \"acc_stderr\": 0.03368608816419543,\n \"acc_norm\": 0.5287817619160767,\n\
\ \"acc_norm_stderr\": 0.03446557212850961,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.5198758467703095,\n\
\ \"mc2_stderr\": 0.01650654731248136\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4197952218430034,\n \"acc_stderr\": 0.014422181226303026,\n\
\ \"acc_norm\": 0.47696245733788395,\n \"acc_norm_stderr\": 0.014595873205358269\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5127464648476399,\n\
\ \"acc_stderr\": 0.00498815974474251,\n \"acc_norm\": 0.694582752439753,\n\
\ \"acc_norm_stderr\": 0.00459642622000091\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777473,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777473\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307695,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n\
\ \"acc_stderr\": 0.027791878753132274,\n \"acc_norm\": 0.6064516129032258,\n\
\ \"acc_norm_stderr\": 0.027791878753132274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533085,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533085\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566197,\n \
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566197\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.01877605231961963,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.01877605231961963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402544,\n \"\
acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402544\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591204,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591204\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924076,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924076\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7164750957854407,\n\
\ \"acc_stderr\": 0.016117318166832272,\n \"acc_norm\": 0.7164750957854407,\n\
\ \"acc_norm_stderr\": 0.016117318166832272\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016134,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016134\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.01556639263005703,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.01556639263005703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.02746661021314013,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.02746661021314013\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722414998,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722414998\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3376792698826597,\n\
\ \"acc_stderr\": 0.01207856377714556,\n \"acc_norm\": 0.3376792698826597,\n\
\ \"acc_norm_stderr\": 0.01207856377714556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.01584631510139481,\n \"mc2\": 0.5198758467703095,\n\
\ \"mc2_stderr\": 0.01650654731248136\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6961325966850829,\n \"acc_stderr\": 0.012926209475483586\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09931766489764973,\n \
\ \"acc_stderr\": 0.008238371412683958\n }\n}\n```"
repo_url: https://huggingface.co/TheSkullery/Aurora-V2-DLEC
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-05.265912.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-52-05.265912.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- '**/details_harness|winogrande|5_2024-03-29T20-52-05.265912.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-52-05.265912.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_52_05.265912
path:
- results_2024-03-29T20-52-05.265912.parquet
- split: latest
path:
- results_2024-03-29T20-52-05.265912.parquet
---
# Dataset Card for Evaluation run of TheSkullery/Aurora-V2-DLEC
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TheSkullery/Aurora-V2-DLEC](https://huggingface.co/TheSkullery/Aurora-V2-DLEC) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheSkullery__Aurora-V2-DLEC",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-29T20:52:05.265912](https://huggingface.co/datasets/open-llm-leaderboard/details_TheSkullery__Aurora-V2-DLEC/blob/main/results_2024-03-29T20-52-05.265912.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5205667326482667,
"acc_stderr": 0.03368608816419543,
"acc_norm": 0.5287817619160767,
"acc_norm_stderr": 0.03446557212850961,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.01584631510139481,
"mc2": 0.5198758467703095,
"mc2_stderr": 0.01650654731248136
},
"harness|arc:challenge|25": {
"acc": 0.4197952218430034,
"acc_stderr": 0.014422181226303026,
"acc_norm": 0.47696245733788395,
"acc_norm_stderr": 0.014595873205358269
},
"harness|hellaswag|10": {
"acc": 0.5127464648476399,
"acc_stderr": 0.00498815974474251,
"acc_norm": 0.694582752439753,
"acc_norm_stderr": 0.00459642622000091
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777473,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777473
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307695,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533085,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533085
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566197,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566197
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.01877605231961963,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.01877605231961963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402544,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402544
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924076,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924076
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7164750957854407,
"acc_stderr": 0.016117318166832272,
"acc_norm": 0.7164750957854407,
"acc_norm_stderr": 0.016117318166832272
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016134,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.01556639263005703,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.01556639263005703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.02746661021314013,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.02746661021314013
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722414998,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722414998
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.01207856377714556,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.01207856377714556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.01584631510139481,
"mc2": 0.5198758467703095,
"mc2_stderr": 0.01650654731248136
},
"harness|winogrande|5": {
"acc": 0.6961325966850829,
"acc_stderr": 0.012926209475483586
},
"harness|gsm8k|5": {
"acc": 0.09931766489764973,
"acc_stderr": 0.008238371412683958
}
}
```
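As a small illustration, per-task metrics like the ones above can be aggregated directly from the results dict. The snippet below is a sketch, not part of the leaderboard tooling: it uses a three-entry excerpt of the MMLU (`hendrycksTest-*`) scores shown above, and in practice you would load the full `results_*.json` file from this repo instead.

```python
# Hypothetical example: macro-average accuracy over MMLU subtasks,
# computed from an excerpt of the results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5921052631578947},
}

# Select only the MMLU (hendrycksTest) tasks by their key prefix.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]

# Unweighted (macro) average across the selected subtasks.
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average acc over {len(mmlu_accs)} subtasks: {macro_avg:.4f}")
```

Note this is an unweighted average over subtasks; the leaderboard's aggregated "all" figure is computed by its own pipeline and may weight tasks differently.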
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
asas-ai/tydiqa-ar | ---
language:
- ar
license: apache-2.0
task_categories:
- question-answering
pretty_name: tydiqa-ar
configs:
- config_name: primary_task
data_files:
- split: train
path: primary_task/train-*
- split: validation
path: primary_task/validation-*
- config_name: secondary_task
data_files:
- split: train
path: secondary_task/train-*
- split: validation
path: secondary_task/validation-*
dataset_info:
- config_name: primary_task
features:
- name: passage_answer_candidates
sequence:
- name: plaintext_start_byte
dtype: int32
- name: plaintext_end_byte
dtype: int32
- name: question_text
dtype: string
- name: document_title
dtype: string
- name: language
dtype: string
- name: annotations
sequence:
- name: passage_answer_candidate_index
dtype: int32
- name: minimal_answers_start_byte
dtype: int32
- name: minimal_answers_end_byte
dtype: int32
- name: yes_no_answer
dtype: string
- name: document_plaintext
dtype: string
- name: document_url
dtype: string
splits:
- name: train
num_bytes: 767894331.3564428
num_examples: 23092
- name: validation
num_bytes: 35803153.66148902
num_examples: 1380
download_size: 0
dataset_size: 803697485.0179318
- config_name: secondary_task
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 15715443.835027365
num_examples: 14805
- name: validation
num_bytes: 908198.6986409297
num_examples: 921
download_size: 0
dataset_size: 16623642.533668295
---
# Dataset Card for "tydiqa-ar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vincent-luo/hagrid-mediapipe-hands | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 111989279184.95
num_examples: 507050
download_size: 112032639870
dataset_size: 111989279184.95
---
# Dataset Card for "hagrid-mediapipe-hands"
This dataset is designed to train a ControlNet with human hands. It includes hand landmarks detected by MediaPipe(for more information refer to: https://developers.google.com/mediapipe/solutions/vision/hand_landmarker).
The source image data is from [HaGRID dataset](https://github.com/hukenovs/hagrid) and we use a modified version from Kaggle(https://www.kaggle.com/datasets/innominate817/hagrid-classification-512p) to build this dataset. There are 507050 data samples in total and the image resolution is 512x512.
### Generate Mediapipe annotation
We use the script below to generate hand landmarks; you should download the `hand_landmarker.task` file first. For more information, please refer to [this](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker).
```python
import mediapipe as mp
from mediapipe import solutions
from mediapipe.framework.formats import landmark_pb2
from mediapipe.tasks import python
from mediapipe.tasks.python import vision
from PIL import Image
import cv2
import numpy as np
def draw_landmarks_on_image(rgb_image, detection_result):
hand_landmarks_list = detection_result.hand_landmarks
handedness_list = detection_result.handedness
annotated_image = np.zeros_like(rgb_image)
# Loop through the detected hands to visualize.
for idx in range(len(hand_landmarks_list)):
hand_landmarks = hand_landmarks_list[idx]
handedness = handedness_list[idx]
# Draw the hand landmarks.
hand_landmarks_proto = landmark_pb2.NormalizedLandmarkList()
hand_landmarks_proto.landmark.extend([
landmark_pb2.NormalizedLandmark(x=landmark.x, y=landmark.y, z=landmark.z) for landmark in hand_landmarks
])
solutions.drawing_utils.draw_landmarks(
annotated_image,
hand_landmarks_proto,
solutions.hands.HAND_CONNECTIONS,
solutions.drawing_styles.get_default_hand_landmarks_style(),
solutions.drawing_styles.get_default_hand_connections_style())
return annotated_image
# Create a HandLandmarker object.
base_options = python.BaseOptions(model_asset_path='hand_landmarker.task')
options = vision.HandLandmarkerOptions(base_options=base_options,
num_hands=2)
detector = vision.HandLandmarker.create_from_options(options)
# Load the input image.
image = np.asarray(Image.open("./test.png"))
image = mp.Image(
image_format=mp.ImageFormat.SRGB, data=image
)
# Detect hand landmarks from the input image.
detection_result = detector.detect(image)
# Draw the detection result on a blank canvas and save it.
annotated_image = draw_landmarks_on_image(image.numpy_view(), detection_result)
cv2.imwrite("ann.png", cv2.cvtColor(annotated_image, cv2.COLOR_RGB2BGR))
``` |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/3b023594 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1337
dataset_size: 184
---
# Dataset Card for "3b023594"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/audioset_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
splits:
- name: original
num_bytes: 6377472402.0
num_examples: 20111
- name: academicodec_hifi_16k_320d
num_bytes: 6367615153.0
num_examples: 20111
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 6367615153.0
num_examples: 20111
- name: academicodec_hifi_24k_320d
num_bytes: 9562795313.0
num_examples: 20111
- name: audiodec_24k_320d
num_bytes: 9553853730.0
num_examples: 20111
- name: dac_16k
num_bytes: 6377489897.0
num_examples: 20111
- name: dac_24k
num_bytes: 9565601505.0
num_examples: 20111
- name: dac_44k
num_bytes: 17575747599.0
num_examples: 20111
- name: encodec_24k_12bps
num_bytes: 9565601505.0
num_examples: 20111
- name: encodec_24k_1_5bps
num_bytes: 9565601505.0
num_examples: 20111
- name: encodec_24k_24bps
num_bytes: 9565601505.0
num_examples: 20111
- name: encodec_24k_3bps
num_bytes: 9565601505.0
num_examples: 20111
- name: encodec_24k_6bps
num_bytes: 9565601505.0
num_examples: 20111
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 6373863275.0
num_examples: 20111
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 6373863275.0
num_examples: 20111
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 6377489897.0
num_examples: 20111
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 6377489897.0
num_examples: 20111
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 6377489897.0
num_examples: 20111
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 6377489897.0
num_examples: 20111
- name: speech_tokenizer_16k
num_bytes: 6380297393.0
num_examples: 20111
download_size: 160336452822
dataset_size: 164214181808.0
---
# Dataset Card for "audioset_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NadiaHolmlund/Japanese_Speech_Examples | ---
license: openrail
---
|
luisf1xc/data_drugs_class | ---
license: unknown
---
|
jeggers/wikipedia_paragraphs_length | ---
dataset_info:
features:
- name: text
dtype: string
- name: length
dtype: int64
- name: page_url
dtype: string
splits:
- name: train
num_bytes: 46950328.3
num_examples: 100000
download_size: 9766922
dataset_size: 46950328.3
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wikipedia_paragraphs_length"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgiaohc/twitter_dataset_1713178090 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20831
num_examples: 48
download_size: 10993
dataset_size: 20831
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Aniket231/my-data-repos | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 34697609
num_examples: 100000
- name: validation
num_bytes: 3474173
num_examples: 10000
download_size: 20309276
dataset_size: 38171782
---
# Dataset Card for "my-data-repos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
swaroopajit/next-dataset-refined-batch-3000 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 297746292.0
num_examples: 999
download_size: 268205162
dataset_size: 297746292.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "next-dataset-refined-batch-3000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sangjeong/testData3 | ---
license: openrail
---
|
mikewang/AwA2 | ---
pretty_name: 'Animals with Attributes v2 (AwA2)'
language:
- en
---
# Dataset Card for Animals with Attributes v2 (AwA2)
## Dataset Description
**Homepage:** https://cvml.ista.ac.at/AwA2/
**IMPORTANT NOTES**
- This HF dataset downloads the dataset (https://cvml.ista.ac.at/AwA2/AwA2-data.zip), and loads the image instances with class-level annotations.
- The "train" split in this HF dataset contains all the images. For the original proposed splits and the proposed splits version 2.0, please refer to [here](https://www.mpi-inf.mpg.de/departments/computer-vision-and-machine-learning/research/zero-shot-learning/zero-shot-learning-the-good-the-bad-and-the-ugly/).
- License files are also included in the downloaded dataset (https://cvml.ista.ac.at/AwA2/AwA2-data.zip)
**Paper Citation:**
```
@article{xian2018zero,
title={Zero-shot learning—a comprehensive evaluation of the good, the bad and the ugly},
author={Xian, Yongqin and Lampert, Christoph H and Schiele, Bernt and Akata, Zeynep},
journal={IEEE transactions on pattern analysis and machine intelligence},
volume={41},
number={9},
pages={2251--2265},
year={2018},
publisher={IEEE}
}
```
## Dataset Summary
This dataset provides a platform to benchmark transfer-learning algorithms, in particular attribute-based classification and zero-shot learning [1]. It can act as a drop-in replacement to the original Animals with Attributes (AwA) dataset [2,3], as it has the same class structure and almost the same characteristics.
It consists of 37322 images of 50 animal classes with pre-extracted feature representations for each image. The classes are aligned with Osherson's classical class/attribute matrix [3,4], thereby providing 85 numeric attribute values for each class. Using the shared attributes, it is possible to transfer information between different classes.
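The attribute-transfer idea above can be sketched with a toy nearest-neighbour classifier in attribute space. This is only an illustration, not the method of [1]: the three classes and binary attributes below are made up, and the real matrix has 50 classes and 85 attributes.

```python
import math

# Hypothetical class/attribute matrix. Columns: stripes, hooves, swims.
class_names = ["zebra", "horse", "dolphin"]
class_attributes = {
    "zebra":   [1.0, 1.0, 0.0],
    "horse":   [0.0, 1.0, 0.0],
    "dolphin": [0.0, 0.0, 1.0],
}

def cosine(u, v):
    # Cosine similarity between two attribute vectors.
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def zero_shot_predict(attribute_scores):
    """Pick the class whose attribute vector best matches the per-image
    attribute scores, so unseen classes need only an attribute description."""
    return max(class_names, key=lambda c: cosine(class_attributes[c], attribute_scores))

# An image whose attribute predictor fired strongly on "stripes" and "hooves".
print(zero_shot_predict([0.9, 0.8, 0.1]))  # zebra
```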
The image data was collected from public sources, such as Flickr, in 2016. In the process we made sure to only include images that are licensed for free use and redistribution, please see the archive for the individual license files. If the dataset contains an image for which you hold the copyright and that was not licensed freely, please contact us at , so we can remove it from the collection.
**References**
[1] Y. Xian, C. H. Lampert, B. Schiele, Z. Akata. "Zero-Shot Learning - A Comprehensive Evaluation of the Good, the Bad and the Ugly", IEEE Transactions on Pattern Analysis and Machine Intelligence (T-PAMI) 40(8), 2018. (arXiv:1707.00600 [cs.CV])
[2] C. H. Lampert, H. Nickisch, and S. Harmeling. "Learning To Detect Unseen Object Classes by Between-Class Attribute Transfer". In CVPR, 2009
[3] C. H. Lampert, H. Nickisch, and S. Harmeling. "Attribute-Based Classification for Zero-Shot Visual Object Categorization". IEEE T-PAMI, 2013
[4] D. N. Osherson, J. Stern, O. Wilkie, M. Stob, and E. E. Smith. "Default probability". Cognitive Science, 15(2), 1991.
[5] C. Kemp, J. B. Tenenbaum, T. L. Griffiths, T. Yamada, and N. Ueda. "Learning systems of concepts with an infinite relational model". In AAAI, 2006. |
lucianoportela/Portelinha | ---
license: mit
---
|
desik98/TeluguRiddles | ---
annotations_creators:
- expert-generated
language:
- te
language_creators:
- expert-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Telugu Riddles
size_categories:
- n<1K
source_datasets:
- original
tags:
- riddles
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Summary
`TeluguRiddles` is an open source dataset of instruct-style records generated by web-scraping multiple riddle websites. This was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License.
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Telugu

Version: 1.0
# Dataset Overview
`TeluguRiddles` is a corpus of more than 800 records generated by web-scraping multiple riddle websites. This dataset can be used for the following task:
- Given the riddle, generate the answer for that riddle.
# Intended Uses
While immediately valuable for instruction fine-tuning large language models as a corpus of instruction prompts, this dataset also presents a valuable opportunity for synthetic data generation. For example, prompt-completions could be submitted as few-shot examples to a large open language model to generate additional riddles and their respective answers.
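That few-shot idea can be sketched as follows (the records are placeholder strings rather than real dataset rows, and the actual model call is omitted):

```python
# Build a few-shot prompt from (inputs, targets) records, as one might do
# when asking a large open model to generate additional riddles.
records = [
    {"inputs": "riddle 1", "targets": "answer 1"},
    {"inputs": "riddle 2", "targets": "answer 2"},
]

def build_few_shot_prompt(records, instruction):
    # Each shot is an input/target pair; the instruction asks for a new one.
    shots = "\n\n".join(f"{r['inputs']}\n{r['targets']}" for r in records)
    return f"{shots}\n\n{instruction}"

prompt = build_few_shot_prompt(records, "Now write a new riddle and its answer.")
print(prompt)
```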
# Dataset
## Load with Datasets
To load this dataset with Datasets, you'll just need to install Datasets as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset('desik98/TeluguRiddles')
```
## Purpose of Collection
Telugu is a low-resource language for which, to the best of my knowledge, there was no instruct-style dataset for riddle-and-answer generation. This was created as a part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI to make sure Telugu is well represented in the space of AI/ML. Unlike other datasets that are limited to non-commercial use, this dataset can be used, modified, and extended for any purpose, including academic or commercial applications.
## Sources
- **Multiple Riddle Websites**: Performed web-scraping from the [1](https://telugupatham.blogspot.com/p/podupu-kathalu.html), [2](http://www.maganti.org/podupu/podupu1.html), [3](https://teluguadda.co.in/podupu-kathalu-telugu-with-answers/), [4](http://palukuteniyalu.blogspot.com/2016/03/blog-post_17.html) and [5](http://mostusefulthings.blogspot.com/2011/06/blog-post.html) websites, which consist of riddles of varying difficulty. Next, performed some pre-processing of the data, like removing unwanted characters and bad riddles from the scraped data. Finally, converted the scraped data into instruct-style prompts and completions.
## Data Fields
- `inputs` : Prompt or input to the language model.
- `targets` : Completion or output of the language model.
- `template_id` : Id of the template used in `inputs` and `targets`.
- `template_lang`: ISO code of the language used in the `inputs` and `targets` where *tel* refers to Telugu.
## Templates
For the creation of instruct-style prompts and completions from the scraped data, the following task was expressed with 2 different templates:
1. Given the riddle, generate the answer for that riddle.
| template_id | inputs | targets |
|-------------|--------|---------|
| 1 | ```ఈ రిడిల్ కి సమాధానం ఇవ్వు {{Riddle}}``` | ```మీరు అడిగిన రిడిల్ కి సమాధానం: {{Answer}}``` |
| 2 | ```ఈ పొడుపు కథ కి సమాధానం ఇవ్వు {{Riddle}}``` | ```మీరు అడిగిన పొడుపు కథ కి సమాధానం: {{Answer}}``` |
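A minimal sketch of how the templates above might be applied to a scraped (riddle, answer) pair. The `{riddle}`/`{answer}` placeholders stand in for the table's `{{Riddle}}`/`{{Answer}}` slots, and the helper function is hypothetical:

```python
# Prompt/completion templates, keyed by template_id as in the table above.
TEMPLATES = {
    1: ("ఈ రిడిల్ కి సమాధానం ఇవ్వు {riddle}",
        "మీరు అడిగిన రిడిల్ కి సమాధానం: {answer}"),
    2: ("ఈ పొడుపు కథ కి సమాధానం ఇవ్వు {riddle}",
        "మీరు అడిగిన పొడుపు కథ కి సమాధానం: {answer}"),
}

def to_record(riddle, answer, template_id):
    # Fill one template pair and attach the metadata fields described below.
    inp, tgt = TEMPLATES[template_id]
    return {
        "inputs": inp.format(riddle=riddle),
        "targets": tgt.format(answer=answer),
        "template_id": template_id,
        "template_lang": "tel",
    }

print(to_record("<riddle text>", "<answer text>", 1))
```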
## Personal or Sensitive Data
This dataset contains public information. To our knowledge, there are no private person’s personal identifiers or sensitive information.
## Language
Telugu
# Known Limitations
- The dataset is scraped from multiple riddle websites, and its contents may reflect bias, factual errors, and inappropriate or sensitive material.
- Although utmost care was taken to keep the dataset monolingual, some records may contain English alongside Telugu.
# Contributors
[Desik98](https://github.com/desik1998) and [SuryaKrishna02](https://github.com/SuryaKrishna02) |
CyberHarem/lass_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lass/ミニスカート (Pokémon)
This is the dataset of lass/ミニスカート (Pokémon), containing 83 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 83 | 71.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lass_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 83 | 42.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lass_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 194 | 91.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lass_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 83 | 64.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lass_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 194 | 122.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lass_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lass_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, open_mouth, school_uniform, solo, white_shirt, collared_shirt, long_sleeves, open_jacket, looking_at_viewer, black_pantyhose, blazer, holding_poke_ball, pleated_skirt, red_jacket, black_skirt, poke_ball_(basic), standing, simple_background, teeth, :d, black_necktie, miniskirt, white_background, blush, shoes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | school_uniform | solo | white_shirt | collared_shirt | long_sleeves | open_jacket | looking_at_viewer | black_pantyhose | blazer | holding_poke_ball | pleated_skirt | red_jacket | black_skirt | poke_ball_(basic) | standing | simple_background | teeth | :d | black_necktie | miniskirt | white_background | blush | shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------------|:-------|:--------------|:-----------------|:---------------|:--------------|:--------------------|:------------------|:---------|:--------------------|:----------------|:-------------|:--------------|:--------------------|:-----------|:--------------------|:--------|:-----|:----------------|:------------|:-------------------|:--------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_AA051615__A0221 | ---
pretty_name: Evaluation run of AA051615/A0221
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051615/A0221](https://huggingface.co/AA051615/A0221) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051615__A0221\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T17:29:48.880336](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__A0221/blob/main/results_2024-02-21T17-29-48.880336.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.83358630360451,\n\
\ \"acc_stderr\": 0.024577594482369926,\n \"acc_norm\": 0.8421603154382172,\n\
\ \"acc_norm_stderr\": 0.02496818685429535,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5512834343620121,\n\
\ \"mc2_stderr\": 0.01560178984974555\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382504,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6553475403306114,\n\
\ \"acc_stderr\": 0.00474283530976365,\n \"acc_norm\": 0.8513244373630751,\n\
\ \"acc_norm_stderr\": 0.0035504128916474466\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.0335567721631314,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.0335567721631314\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9144736842105263,\n \"acc_stderr\": 0.022758677130888604,\n\
\ \"acc_norm\": 0.9144736842105263,\n \"acc_norm_stderr\": 0.022758677130888604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.86,\n\
\ \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \
\ \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8754716981132076,\n \"acc_stderr\": 0.020321376630696233,\n\
\ \"acc_norm\": 0.8754716981132076,\n \"acc_norm_stderr\": 0.020321376630696233\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9583333333333334,\n\
\ \"acc_stderr\": 0.01671031580295997,\n \"acc_norm\": 0.9583333333333334,\n\
\ \"acc_norm_stderr\": 0.01671031580295997\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8208092485549133,\n\
\ \"acc_stderr\": 0.029242513059063294,\n \"acc_norm\": 0.8208092485549133,\n\
\ \"acc_norm_stderr\": 0.029242513059063294\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.04655010411319609,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.04655010411319609\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8978723404255319,\n \"acc_stderr\": 0.019795708842206803,\n\
\ \"acc_norm\": 0.8978723404255319,\n \"acc_norm_stderr\": 0.019795708842206803\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7368421052631579,\n\
\ \"acc_stderr\": 0.04142439719489368,\n \"acc_norm\": 0.7368421052631579,\n\
\ \"acc_norm_stderr\": 0.04142439719489368\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8482758620689655,\n \"acc_stderr\": 0.029896107594574617,\n\
\ \"acc_norm\": 0.8482758620689655,\n \"acc_norm_stderr\": 0.029896107594574617\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7857142857142857,\n \"acc_stderr\": 0.021132859182754447,\n \"\
acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.021132859182754447\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n\
\ \"acc_stderr\": 0.043758884927270585,\n \"acc_norm\": 0.6031746031746031,\n\
\ \"acc_norm_stderr\": 0.043758884927270585\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9387096774193548,\n \"acc_stderr\": 0.013645277160910884,\n \"\
acc_norm\": 0.9387096774193548,\n \"acc_norm_stderr\": 0.013645277160910884\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.8029556650246306,\n \"acc_stderr\": 0.027986724666736223,\n \"\
acc_norm\": 0.8029556650246306,\n \"acc_norm_stderr\": 0.027986724666736223\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\"\
: 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.022448399923854282,\n\
\ \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.022448399923854282\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"\
acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909042,\n\
\ \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909042\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8692307692307693,\n \"acc_stderr\": 0.017094072023289643,\n\
\ \"acc_norm\": 0.8692307692307693,\n \"acc_norm_stderr\": 0.017094072023289643\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.6888888888888889,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.9327731092436975,\n \"acc_stderr\": 0.01626617155929388,\n \
\ \"acc_norm\": 0.9327731092436975,\n \"acc_norm_stderr\": 0.01626617155929388\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.6490066225165563,\n \"acc_stderr\": 0.03896981964257374,\n \"\
acc_norm\": 0.6490066225165563,\n \"acc_norm_stderr\": 0.03896981964257374\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9522935779816514,\n \"acc_stderr\": 0.009138489155094909,\n \"\
acc_norm\": 0.9522935779816514,\n \"acc_norm_stderr\": 0.009138489155094909\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7824074074074074,\n \"acc_stderr\": 0.02813968944485967,\n \"\
acc_norm\": 0.7824074074074074,\n \"acc_norm_stderr\": 0.02813968944485967\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9656862745098039,\n \"acc_stderr\": 0.012776266045095932,\n \"\
acc_norm\": 0.9656862745098039,\n \"acc_norm_stderr\": 0.012776266045095932\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9451476793248945,\n \"acc_stderr\": 0.014821471997344062,\n \
\ \"acc_norm\": 0.9451476793248945,\n \"acc_norm_stderr\": 0.014821471997344062\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8789237668161435,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.8789237668161435,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n\
\ \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9421487603305785,\n \"acc_stderr\": 0.021312061087979534,\n \"\
acc_norm\": 0.9421487603305785,\n \"acc_norm_stderr\": 0.021312061087979534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9259259259259259,\n\
\ \"acc_stderr\": 0.025317997297209727,\n \"acc_norm\": 0.9259259259259259,\n\
\ \"acc_norm_stderr\": 0.025317997297209727\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9325153374233128,\n \"acc_stderr\": 0.01970938281499789,\n\
\ \"acc_norm\": 0.9325153374233128,\n \"acc_norm_stderr\": 0.01970938281499789\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6964285714285714,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.6964285714285714,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\
\ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9572649572649573,\n\
\ \"acc_stderr\": 0.013250436685245014,\n \"acc_norm\": 0.9572649572649573,\n\
\ \"acc_norm_stderr\": 0.013250436685245014\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.946360153256705,\n\
\ \"acc_stderr\": 0.008056911822364876,\n \"acc_norm\": 0.946360153256705,\n\
\ \"acc_norm_stderr\": 0.008056911822364876\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8439306358381503,\n \"acc_stderr\": 0.019539014685374036,\n\
\ \"acc_norm\": 0.8439306358381503,\n \"acc_norm_stderr\": 0.019539014685374036\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8547486033519553,\n\
\ \"acc_stderr\": 0.011784484757787555,\n \"acc_norm\": 0.8547486033519553,\n\
\ \"acc_norm_stderr\": 0.011784484757787555\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.9052287581699346,\n \"acc_stderr\": 0.01677133127183646,\n\
\ \"acc_norm\": 0.9052287581699346,\n \"acc_norm_stderr\": 0.01677133127183646\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8778135048231511,\n\
\ \"acc_stderr\": 0.018600811252967923,\n \"acc_norm\": 0.8778135048231511,\n\
\ \"acc_norm_stderr\": 0.018600811252967923\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8950617283950617,\n \"acc_stderr\": 0.0170526620818853,\n\
\ \"acc_norm\": 0.8950617283950617,\n \"acc_norm_stderr\": 0.0170526620818853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.7163120567375887,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.7163120567375887,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.7822685788787483,\n\
\ \"acc_stderr\": 0.01054065064249993,\n \"acc_norm\": 0.7822685788787483,\n\
\ \"acc_norm_stderr\": 0.01054065064249993\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9154411764705882,\n \"acc_stderr\": 0.016900908171490606,\n\
\ \"acc_norm\": 0.9154411764705882,\n \"acc_norm_stderr\": 0.016900908171490606\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8872549019607843,\n \"acc_stderr\": 0.012795357747288058,\n \
\ \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.012795357747288058\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8090909090909091,\n\
\ \"acc_stderr\": 0.03764425585984927,\n \"acc_norm\": 0.8090909090909091,\n\
\ \"acc_norm_stderr\": 0.03764425585984927\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8816326530612245,\n \"acc_stderr\": 0.02068068296798584,\n\
\ \"acc_norm\": 0.8816326530612245,\n \"acc_norm_stderr\": 0.02068068296798584\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9353233830845771,\n\
\ \"acc_stderr\": 0.017391600291491064,\n \"acc_norm\": 0.9353233830845771,\n\
\ \"acc_norm_stderr\": 0.017391600291491064\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.97,\n \"acc_stderr\": 0.01714466079977652,\n \
\ \"acc_norm\": 0.97,\n \"acc_norm_stderr\": 0.01714466079977652\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6385542168674698,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.6385542168674698,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.935672514619883,\n \"acc_stderr\": 0.018816366468768296,\n\
\ \"acc_norm\": 0.935672514619883,\n \"acc_norm_stderr\": 0.018816366468768296\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5512834343620121,\n\
\ \"mc2_stderr\": 0.01560178984974555\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5784685367702805,\n \
\ \"acc_stderr\": 0.013601824409483262\n }\n}\n```"
repo_url: https://huggingface.co/AA051615/A0221
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|arc:challenge|25_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|gsm8k|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hellaswag|10_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T17-29-48.880336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T17-29-48.880336.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- '**/details_harness|winogrande|5_2024-02-21T17-29-48.880336.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T17-29-48.880336.parquet'
- config_name: results
data_files:
- split: 2024_02_21T17_29_48.880336
path:
- results_2024-02-21T17-29-48.880336.parquet
- split: latest
path:
- results_2024-02-21T17-29-48.880336.parquet
---
# Dataset Card for Evaluation run of AA051615/A0221
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051615/A0221](https://huggingface.co/AA051615/A0221) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051615__A0221",
	"harness_winogrande_5",
	split="latest")
```
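The per-task configuration names listed in the metadata above follow a fixed pattern: they are the task identifier used in the parquet filenames with the `|`, `:` and `-` separators flattened to underscores. A small illustrative helper (not part of the leaderboard tooling) that derives a config name from a task identifier:

```python
def config_name(task_id: str) -> str:
    """Map a task identifier as used in the parquet filenames
    (e.g. 'harness|truthfulqa:mc|0') to its dataset config name."""
    # Flatten the separators used in the parquet paths to underscores.
    return task_id.replace("|", "_").replace(":", "_").replace("-", "_")

print(config_name("harness|truthfulqa:mc|0"))                  # harness_truthfulqa_mc_0
print(config_name("harness|hendrycksTest-world_religions|5"))  # harness_hendrycksTest_world_religions_5
```

Any of the derived names can then be passed as the second argument to `load_dataset` exactly as in the snippet above.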
## Latest results
These are the [latest results from run 2024-02-21T17:29:48.880336](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051615__A0221/blob/main/results_2024-02-21T17-29-48.880336.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.83358630360451,
"acc_stderr": 0.024577594482369926,
"acc_norm": 0.8421603154382172,
"acc_norm_stderr": 0.02496818685429535,
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5512834343620121,
"mc2_stderr": 0.01560178984974555
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382504,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6553475403306114,
"acc_stderr": 0.00474283530976365,
"acc_norm": 0.8513244373630751,
"acc_norm_stderr": 0.0035504128916474466
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.0335567721631314,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.0335567721631314
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9144736842105263,
"acc_stderr": 0.022758677130888604,
"acc_norm": 0.9144736842105263,
"acc_norm_stderr": 0.022758677130888604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8754716981132076,
"acc_stderr": 0.020321376630696233,
"acc_norm": 0.8754716981132076,
"acc_norm_stderr": 0.020321376630696233
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9583333333333334,
"acc_stderr": 0.01671031580295997,
"acc_norm": 0.9583333333333334,
"acc_norm_stderr": 0.01671031580295997
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.029242513059063294,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.029242513059063294
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.04655010411319609,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.04655010411319609
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8978723404255319,
"acc_stderr": 0.019795708842206803,
"acc_norm": 0.8978723404255319,
"acc_norm_stderr": 0.019795708842206803
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.04142439719489368,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.04142439719489368
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8482758620689655,
"acc_stderr": 0.029896107594574617,
"acc_norm": 0.8482758620689655,
"acc_norm_stderr": 0.029896107594574617
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.021132859182754447,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.021132859182754447
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.043758884927270585,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.043758884927270585
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9387096774193548,
"acc_stderr": 0.013645277160910884,
"acc_norm": 0.9387096774193548,
"acc_norm_stderr": 0.013645277160910884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.8029556650246306,
"acc_stderr": 0.027986724666736223,
"acc_norm": 0.8029556650246306,
"acc_norm_stderr": 0.027986724666736223
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.022448399923854282,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.022448399923854282
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723333,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909042,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909042
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8692307692307693,
"acc_stderr": 0.017094072023289643,
"acc_norm": 0.8692307692307693,
"acc_norm_stderr": 0.017094072023289643
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.9327731092436975,
"acc_stderr": 0.01626617155929388,
"acc_norm": 0.9327731092436975,
"acc_norm_stderr": 0.01626617155929388
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6490066225165563,
"acc_stderr": 0.03896981964257374,
"acc_norm": 0.6490066225165563,
"acc_norm_stderr": 0.03896981964257374
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9522935779816514,
"acc_stderr": 0.009138489155094909,
"acc_norm": 0.9522935779816514,
"acc_norm_stderr": 0.009138489155094909
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7824074074074074,
"acc_stderr": 0.02813968944485967,
"acc_norm": 0.7824074074074074,
"acc_norm_stderr": 0.02813968944485967
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9656862745098039,
"acc_stderr": 0.012776266045095932,
"acc_norm": 0.9656862745098039,
"acc_norm_stderr": 0.012776266045095932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9451476793248945,
"acc_stderr": 0.014821471997344062,
"acc_norm": 0.9451476793248945,
"acc_norm_stderr": 0.014821471997344062
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8789237668161435,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.8789237668161435,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9083969465648855,
"acc_stderr": 0.025300035578642962,
"acc_norm": 0.9083969465648855,
"acc_norm_stderr": 0.025300035578642962
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9421487603305785,
"acc_stderr": 0.021312061087979534,
"acc_norm": 0.9421487603305785,
"acc_norm_stderr": 0.021312061087979534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9259259259259259,
"acc_stderr": 0.025317997297209727,
"acc_norm": 0.9259259259259259,
"acc_norm_stderr": 0.025317997297209727
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9325153374233128,
"acc_stderr": 0.01970938281499789,
"acc_norm": 0.9325153374233128,
"acc_norm_stderr": 0.01970938281499789
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6964285714285714,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.6964285714285714,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9572649572649573,
"acc_stderr": 0.013250436685245014,
"acc_norm": 0.9572649572649573,
"acc_norm_stderr": 0.013250436685245014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.946360153256705,
"acc_stderr": 0.008056911822364876,
"acc_norm": 0.946360153256705,
"acc_norm_stderr": 0.008056911822364876
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8439306358381503,
"acc_stderr": 0.019539014685374036,
"acc_norm": 0.8439306358381503,
"acc_norm_stderr": 0.019539014685374036
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8547486033519553,
"acc_stderr": 0.011784484757787555,
"acc_norm": 0.8547486033519553,
"acc_norm_stderr": 0.011784484757787555
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9052287581699346,
"acc_stderr": 0.01677133127183646,
"acc_norm": 0.9052287581699346,
"acc_norm_stderr": 0.01677133127183646
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8778135048231511,
"acc_stderr": 0.018600811252967923,
"acc_norm": 0.8778135048231511,
"acc_norm_stderr": 0.018600811252967923
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8950617283950617,
"acc_stderr": 0.0170526620818853,
"acc_norm": 0.8950617283950617,
"acc_norm_stderr": 0.0170526620818853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7163120567375887,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.7163120567375887,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.7822685788787483,
"acc_stderr": 0.01054065064249993,
"acc_norm": 0.7822685788787483,
"acc_norm_stderr": 0.01054065064249993
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9154411764705882,
"acc_stderr": 0.016900908171490606,
"acc_norm": 0.9154411764705882,
"acc_norm_stderr": 0.016900908171490606
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.012795357747288058,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.012795357747288058
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8090909090909091,
"acc_stderr": 0.03764425585984927,
"acc_norm": 0.8090909090909091,
"acc_norm_stderr": 0.03764425585984927
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8816326530612245,
"acc_stderr": 0.02068068296798584,
"acc_norm": 0.8816326530612245,
"acc_norm_stderr": 0.02068068296798584
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9353233830845771,
"acc_stderr": 0.017391600291491064,
"acc_norm": 0.9353233830845771,
"acc_norm_stderr": 0.017391600291491064
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.97,
"acc_stderr": 0.01714466079977652,
"acc_norm": 0.97,
"acc_norm_stderr": 0.01714466079977652
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6385542168674698,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.6385542168674698,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.935672514619883,
"acc_stderr": 0.018816366468768296,
"acc_norm": 0.935672514619883,
"acc_norm_stderr": 0.018816366468768296
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5512834343620121,
"mc2_stderr": 0.01560178984974555
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242914
},
"harness|gsm8k|5": {
"acc": 0.5784685367702805,
"acc_stderr": 0.013601824409483262
}
}
```
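Because the results file is plain JSON, the aggregate metrics above can be pulled out with the standard library alone; a minimal sketch using a trimmed copy of the block above (no network access or `datasets` dependency required):

```python
import json

# Trimmed copy of the results shown above (aggregate block plus two tasks).
raw = """
{
  "all": {"acc": 0.83358630360451, "acc_norm": 0.8421603154382172,
          "mc1": 0.386780905752754, "mc2": 0.5512834343620121},
  "harness|winogrande|5": {"acc": 0.8129439621152328},
  "harness|gsm8k|5": {"acc": 0.5784685367702805}
}
"""
results = json.loads(raw)

# Aggregate accuracy across all tasks:
print(f"acc      = {results['all']['acc']:.4f}")       # 0.8336
print(f"acc_norm = {results['all']['acc_norm']:.4f}")  # 0.8422
# Individual task scores live under their harness identifiers:
for task, scores in results.items():
    if task != "all":
        print(f"{task}: acc = {scores['acc']:.4f}")
```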
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rthatha/GLDv2-All-Caption-Monza | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 109011903.0
num_examples: 203
download_size: 108833302
dataset_size: 109011903.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
parambharat/bengali_asr_corpus | ---
annotations_creators:
- found
language:
- bn
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Bengali ASR Corpus
size_categories:
- 100K<n<1M
source_datasets:
- extended|openslr
tags: []
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for Bengali ASR Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@parambharat](https://github.com/parambharat) for adding this dataset. |
distilled-from-one-sec-cv12/chunk_175 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 979447332
num_examples: 190851
download_size: 1000551510
dataset_size: 979447332
---
# Dataset Card for "chunk_175"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqudupstack-gaming | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 105494
num_examples: 1595
- name: corpus
num_bytes: 20666596
num_examples: 45301
download_size: 12946080
dataset_size: 20772090
---
# Dataset Card for "cqudupstack-gaming"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tsac | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- aeb
license:
- lgpl-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
paperswithcode_id: tsac
pretty_name: Tunisian Sentiment Analysis Corpus
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: target
dtype:
class_label:
names:
'0': '1'
'1': '-1'
splits:
- name: train
num_bytes: 1020146
num_examples: 13669
- name: test
num_bytes: 268504
num_examples: 3400
download_size: 963015
dataset_size: 1288650
---
# Dataset Card for Tunisian Sentiment Analysis Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** None
- **Repository:** https://github.com/fbougares/TSAC
- **Paper:** https://www.aclweb.org/anthology/W17-1307
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** Salima Mdhaffar (firstname.lastname@univ-lemans.fr)
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
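The card does not yet document an example instance, but the feature schema declared in the YAML metadata above implies records with an `id`, a `sentence`, and an integer-encoded `target` whose class names are `'1'` and `'-1'`. A minimal pure-Python sketch of decoding the label (the example values are illustrative, not taken from the corpus):

```python
# Class order as declared in the card metadata: index 0 -> "1", index 1 -> "-1".
CLASS_NAMES = ["1", "-1"]

def decode_target(example: dict) -> str:
    """Return the original polarity string for an integer-encoded target."""
    return CLASS_NAMES[example["target"]]

# Illustrative record following the documented schema:
example = {"id": "0", "sentence": "...", "target": 1}
print(decode_target(example))  # -1
```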
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
open-llm-leaderboard/details_openBuddy__openbuddy-llama2-34b-v11.1-bf16 | ---
pretty_name: Evaluation run of openBuddy/openbuddy-llama2-34b-v11.1-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openBuddy/openbuddy-llama2-34b-v11.1-bf16](https://huggingface.co/openBuddy/openbuddy-llama2-34b-v11.1-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openBuddy__openbuddy-llama2-34b-v11.1-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T15:31:04.396852](https://huggingface.co/datasets/open-llm-leaderboard/details_openBuddy__openbuddy-llama2-34b-v11.1-bf16/blob/main/results_2023-10-24T15-31-04.396852.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.360633389261745,\n\
\ \"em_stderr\": 0.004917536525106699,\n \"f1\": 0.4180935402684579,\n\
\ \"f1_stderr\": 0.004778710905980245,\n \"acc\": 0.5268440191410464,\n\
\ \"acc_stderr\": 0.012939810741097795\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.360633389261745,\n \"em_stderr\": 0.004917536525106699,\n\
\ \"f1\": 0.4180935402684579,\n \"f1_stderr\": 0.004778710905980245\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3457164518574678,\n \
\ \"acc_stderr\": 0.013100422990441578\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754013\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openBuddy/openbuddy-llama2-34b-v11.1-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|arc:challenge|25_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T13_56_54.496754
path:
- '**/details_harness|drop|3_2023-10-24T13-56-54.496754.parquet'
- split: 2023_10_24T15_31_04.396852
path:
- '**/details_harness|drop|3_2023-10-24T15-31-04.396852.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T15-31-04.396852.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T13_56_54.496754
path:
- '**/details_harness|gsm8k|5_2023-10-24T13-56-54.496754.parquet'
- split: 2023_10_24T15_31_04.396852
path:
- '**/details_harness|gsm8k|5_2023-10-24T15-31-04.396852.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T15-31-04.396852.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hellaswag|10_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-53-35.640501.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T11-53-35.640501.parquet'
- split: 2023_09_13T12_14_53.531149
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-14-53.531149.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-14-53.531149.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T13_56_54.496754
path:
- '**/details_harness|winogrande|5_2023-10-24T13-56-54.496754.parquet'
- split: 2023_10_24T15_31_04.396852
path:
- '**/details_harness|winogrande|5_2023-10-24T15-31-04.396852.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T15-31-04.396852.parquet'
- config_name: results
data_files:
- split: 2023_09_13T11_53_35.640501
path:
- results_2023-09-13T11-53-35.640501.parquet
- split: 2023_09_13T12_14_53.531149
path:
- results_2023-09-13T12-14-53.531149.parquet
- split: 2023_10_24T13_56_54.496754
path:
- results_2023-10-24T13-56-54.496754.parquet
- split: 2023_10_24T15_31_04.396852
path:
- results_2023-10-24T15-31-04.396852.parquet
- split: latest
path:
- results_2023-10-24T15-31-04.396852.parquet
---
# Dataset Card for Evaluation run of openBuddy/openbuddy-llama2-34b-v11.1-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openBuddy/openbuddy-llama2-34b-v11.1-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openBuddy/openbuddy-llama2-34b-v11.1-bf16](https://huggingface.co/openBuddy/openbuddy-llama2-34b-v11.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openBuddy__openbuddy-llama2-34b-v11.1-bf16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T15:31:04.396852](https://huggingface.co/datasets/open-llm-leaderboard/details_openBuddy__openbuddy-llama2-34b-v11.1-bf16/blob/main/results_2023-10-24T15-31-04.396852.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.360633389261745,
"em_stderr": 0.004917536525106699,
"f1": 0.4180935402684579,
"f1_stderr": 0.004778710905980245,
"acc": 0.5268440191410464,
"acc_stderr": 0.012939810741097795
},
"harness|drop|3": {
"em": 0.360633389261745,
"em_stderr": 0.004917536525106699,
"f1": 0.4180935402684579,
"f1_stderr": 0.004778710905980245
},
"harness|gsm8k|5": {
"acc": 0.3457164518574678,
"acc_stderr": 0.013100422990441578
},
"harness|winogrande|5": {
"acc": 0.7079715864246251,
"acc_stderr": 0.012779198491754013
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zolak/twitter_dataset_79_1713161442 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 338947
num_examples: 843
download_size: 178325
dataset_size: 338947
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/fwv2_random_rare_train_1000_eval_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 218225
num_examples: 2100
- name: train_doc2id
num_bytes: 100243
num_examples: 1100
- name: train_id2doc
num_bytes: 103543
num_examples: 1100
- name: train_find_word
num_bytes: 114682
num_examples: 1000
- name: eval_find_word
num_bytes: 11342
num_examples: 100
- name: id_context_mapping
num_bytes: 68343
num_examples: 1100
download_size: 0
dataset_size: 616378
---
# Dataset Card for "fwv2_random_rare_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
santoshtyss/billsum | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 186689203
num_examples: 16107
- name: test
num_bytes: 37866257
num_examples: 3269
- name: ca_test
num_bytes: 14945291
num_examples: 1237
- name: validation
num_bytes: 32906887
num_examples: 2842
download_size: 113748846
dataset_size: 272407638
---
# Dataset Card for "billsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tiennv/vietnamese-corpus | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8142342251
num_examples: 19233991
download_size: 4233458271
dataset_size: 8142342251
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vietnamese-corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
on1onmangoes/First11VoiceHarmonyEmbeddings071523GoodVersion | ---
dataset_info:
features:
- name: speaker_id
dtype: string
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 22829
num_examples: 11
download_size: 33622
dataset_size: 22829
---
# Dataset Card for "First11VoiceHarmonyEmbeddings071523GoodVersion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Leon-LLM/Leon-Chess-Dataset-71k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38778712
num_examples: 71641
download_size: 19940618
dataset_size: 38778712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-71k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_26 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 866533068.0
num_examples: 168849
download_size: 887421386
dataset_size: 866533068.0
---
# Dataset Card for "chunk_26"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Arjun-G-Ravi/Python-codes | ---
license: mit
task_categories:
- text-generation
- text2text-generation
language:
- en
tags:
- code
pretty_name: Python codes dataset
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
Please note that this dataset may not be perfect and may contain a very small quantity of non-Python code.
### Dataset Summary
The dataset contains a collection of Python questions and their code. This is meant to be used for training models to be efficient in Python-specific coding.
The dataset has two features - 'question' and 'code'.
An example is:
```
{'question': 'Create a function that takes in a string and counts the number of vowels in it',
'code': 'def count_vowels(string):\n vowels = ["a", "e", "i", "o", "u"]\n count = 0\n for char in string:\n if char in vowels:\n count += 1\n return count'}
```
### Languages
English, Python
### Source Data
The dataset is derived from two other coding-based datasets:
1) sahil2801/CodeAlpaca-20k
2) neulab/conala

```bibtex
@inproceedings{yin2018learning,
  title={Learning to mine aligned code and natural language pairs from stack overflow},
  author={Yin, Pengcheng and Deng, Bowen and Chen, Edgar and Vasilescu, Bogdan and Neubig, Graham},
  booktitle={2018 IEEE/ACM 15th international conference on mining software repositories (MSR)},
  pages={476--486},
  year={2018},
  organization={IEEE}
}
```
### Licensing Information
This dataset uses the MIT license.
### Citation Information
Will be added soon
|
mrcaelumn/yelp_restaurant_review_labelled | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 2282684498
num_examples: 4111534
- name: test
num_bytes: 571038991
num_examples: 1027884
download_size: 0
dataset_size: 2853723489
---
# Dataset Card for "yelp_restaurant_review_labelled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# More info about the dataset
dataset downloaded from [Yelp](https://www.yelp.com/dataset/download)
### labelling
if review stars < 3, the label is 0 (negative)\
else if review stars == 3, the label is 1 (neutral)\
else (review stars > 3), the label is 2 (positive)
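A minimal sketch of this star-to-label mapping as a Python function (the `stars` name is assumed from the Yelp review JSON; adapt to your column name as needed):

```python
def label_from_stars(stars: float) -> int:
    """Map a Yelp review star rating to a sentiment label: 0/1/2."""
    if stars < 3:
        return 0  # negative
    elif stars == 3:
        return 1  # neutral
    else:
        return 2  # positive
```

For example, `label_from_stars(4.0)` returns `2` (positive).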
2nayun/trash1 | ---
license: openrail
---
|
CyberHarem/shirayuki_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shirayuki (Kantai Collection)
This is the dataset of shirayuki (Kantai Collection), containing 373 images and their tags.
The core tags of this character are `brown_hair, twintails, brown_eyes, low_twintails, short_hair, short_twintails, bangs, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 373 | 245.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 373 | 183.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 682 | 330.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 373 | 231.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 682 | 400.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shirayuki_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | blue_skirt, pleated_skirt, serafuku, solo_focus, blue_sailor_collar, shirt, short_sleeves, open_mouth, 3girls, smile, 2girls, long_hair, neckerchief |
| 1 | 5 |  |  |  |  |  | pleated_skirt, serafuku, solo_focus, 2girls, sitting, smile, blush, open_mouth |
| 2 | 33 |  |  |  |  |  | 1girl, blue_sailor_collar, neckerchief, serafuku, solo, collared_shirt, simple_background, white_background, looking_at_viewer, blue_skirt, pleated_skirt, short_sleeves, smile, upper_body, one-hour_drawing_challenge |
| 3 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, sitting, smile, socks, solo, blush, pleated_skirt, neckerchief |
| 4 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_dress, enmaided, smile, blush, cowboy_shot, maid_headdress, simple_background, white_apron, white_background, breasts, frilled_apron, hair_between_eyes, puffy_short_sleeves, twitter_username, closed_mouth, gradient_background, one-hour_drawing_challenge, open_mouth, white_gloves |
| 5 | 8 |  |  |  |  |  | 1girl, solo, underwear_only, navel, looking_at_viewer, small_breasts, blush, collarbone, standing, white_panties, full_body, white_bra, barefoot, open_mouth, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blue_skirt | pleated_skirt | serafuku | solo_focus | blue_sailor_collar | shirt | short_sleeves | open_mouth | 3girls | smile | 2girls | long_hair | neckerchief | sitting | blush | 1girl | solo | collared_shirt | simple_background | white_background | looking_at_viewer | upper_body | one-hour_drawing_challenge | socks | black_dress | enmaided | cowboy_shot | maid_headdress | white_apron | breasts | frilled_apron | hair_between_eyes | puffy_short_sleeves | twitter_username | closed_mouth | gradient_background | white_gloves | underwear_only | navel | small_breasts | collarbone | standing | white_panties | full_body | white_bra | barefoot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------|:----------------|:-----------|:-------------|:---------------------|:--------|:----------------|:-------------|:---------|:--------|:---------|:------------|:--------------|:----------|:--------|:--------|:-------|:-----------------|:--------------------|:-------------------|:--------------------|:-------------|:-----------------------------|:--------|:--------------|:-----------|:--------------|:-----------------|:--------------|:----------|:----------------|:--------------------|:----------------------|:-------------------|:---------------|:----------------------|:---------------|:-----------------|:--------|:----------------|:-------------|:-----------|:----------------|:------------|:------------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | | X | X | X | | | | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 33 |  |  |  |  |  | X | X | X | | X | | X | | | X | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | X | X | | | | | | | X | | | X | X | X | X | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | | | | | | | | X | | X | | | | | X | X | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | | | | | | | | X | | | | | | | X | X | X | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
cidtd-mod-ua/WizardLM-ukrainian | ---
dataset_info:
features:
- name: idx
dtype: string
- name: input
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 565345741
num_examples: 142801
download_size: 254868629
dataset_size: 565345741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
language:
- uk
pretty_name: WizardLM Ukraininan version
size_categories:
- 100K<n<1M
---
# WizardLM Translated to Ukrainian 🇺🇦
## Dataset Description
A Ukrainian language dataset comprising 140,000+ records translated from the [WizardLM](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k) dataset.
This dataset is suitable for various natural language processing tasks.
**This is not merged with original ShareGPT threads.**
Data was translated using the Google Gemini Pro API.
Слава Україні!
## Disclaimer
Prepare the data before use. There are some errors in the texts, so be careful.
## How to Use
This dataset can be loaded using the Hugging Face Datasets library:
```python
from datasets import load_dataset
dataset = load_dataset('cidtd-mod-ua/WizardLM-ukrainian')
```
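Since the card advises preparing the data before use, a minimal cleanup pass might look like the sketch below. The field names (`idx`, `input`, `response`) come from the feature list above, while the helper name and the length threshold are purely illustrative assumptions:

```python
# Hypothetical cleanup pass for the translated records. The card warns that
# the texts contain translation errors, so empty or trivial records are
# dropped before fine-tuning. The 10-character threshold is an assumption
# for illustration, not a property of the dataset.

def clean_records(records, min_response_chars=10):
    """Keep records whose input is non-empty and whose response is non-trivial."""
    cleaned = []
    for rec in records:
        inp = (rec.get("input") or "").strip()
        resp = (rec.get("response") or "").strip()
        if inp and len(resp) >= min_response_chars:
            cleaned.append({"idx": rec.get("idx"), "input": inp, "response": resp})
    return cleaned

# Toy records mimicking the dataset's schema:
sample = [
    {"idx": "0", "input": "Що таке мова?", "response": "Мова є системою знаків для спілкування."},
    {"idx": "1", "input": "", "response": "Порожній вхід відкидається."},
    {"idx": "2", "input": "Питання", "response": "так"},
]
print(len(clean_records(sample)))  # prints 1
```

In practice the same predicate could be applied to the loaded `train` split via `dataset.filter(...)`, with the threshold tuned to the downstream task.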
## Citation
```bibtex
@misc{WizardLM-ukrainian,
title = {WizardLM - translation of WizardLM},
author = {Center of Innovations and Defence Technologies Development of Ministry of Defence of Ukraine},
year = {2024},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/cidtd-mod-ua/WizardLM-ukrainian/}
}
```
## Citation of the original dataset
```bibtex
@misc{WizardLM/WizardLM_evol_instruct_V2_196k,
title = {WizardLM evol instruct dataset},
author = {WizardLM},
year = {2023},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k}
}
```
|
Jairool/jairo | ---
license: openrail
---
|
open-llm-leaderboard/details_deepseek-ai__deepseek-coder-7b-instruct-v1.5 | ---
pretty_name: Evaluation run of deepseek-ai/deepseek-coder-7b-instruct-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepseek-ai/deepseek-coder-7b-instruct-v1.5](https://huggingface.co/deepseek-ai/deepseek-coder-7b-instruct-v1.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-coder-7b-instruct-v1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T13:06:15.255477](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-7b-instruct-v1.5/blob/main/results_2024-02-18T13-06-15.255477.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5022066458087141,\n\
\ \"acc_stderr\": 0.03488325466487485,\n \"acc_norm\": 0.507929969563447,\n\
\ \"acc_norm_stderr\": 0.03564529922081144,\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.4673350397051116,\n\
\ \"mc2_stderr\": 0.015209098662952647\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46331058020477817,\n \"acc_stderr\": 0.014572000527756994,\n\
\ \"acc_norm\": 0.4854948805460751,\n \"acc_norm_stderr\": 0.014605241081370053\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5399322844054969,\n\
\ \"acc_stderr\": 0.004973842670559797,\n \"acc_norm\": 0.7234614618601872,\n\
\ \"acc_norm_stderr\": 0.00446372107131909\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723456,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723456\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594962,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594962\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.041042692118062316,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.041042692118062316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727062,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727062\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5387096774193548,\n \"acc_stderr\": 0.028358634859836935,\n \"\
acc_norm\": 0.5387096774193548,\n \"acc_norm_stderr\": 0.028358634859836935\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5854922279792746,\n \"acc_stderr\": 0.035553003195576686,\n\
\ \"acc_norm\": 0.5854922279792746,\n \"acc_norm_stderr\": 0.035553003195576686\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49230769230769234,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6844036697247706,\n \"acc_stderr\": 0.01992611751386967,\n \"\
acc_norm\": 0.6844036697247706,\n \"acc_norm_stderr\": 0.01992611751386967\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\"\
: 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5637254901960784,\n\
\ \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.5637254901960784,\n\
\ \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185815,\n\
\ \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185815\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n\
\ \"acc_stderr\": 0.033530461674123005,\n \"acc_norm\": 0.5201793721973094,\n\
\ \"acc_norm_stderr\": 0.033530461674123005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.043482080516448585,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.043482080516448585\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.038470214204560246,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.038470214204560246\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689049,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689049\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.611749680715198,\n\
\ \"acc_stderr\": 0.01742767329554434,\n \"acc_norm\": 0.611749680715198,\n\
\ \"acc_norm_stderr\": 0.01742767329554434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.02691504735536981,\n\
\ \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.02691504735536981\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.358659217877095,\n\
\ \"acc_stderr\": 0.016040454426164464,\n \"acc_norm\": 0.358659217877095,\n\
\ \"acc_norm_stderr\": 0.016040454426164464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280545,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280545\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n\
\ \"acc_stderr\": 0.028394421370984545,\n \"acc_norm\": 0.5080385852090032,\n\
\ \"acc_norm_stderr\": 0.028394421370984545\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.027744313443376536,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.027744313443376536\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35919165580182527,\n\
\ \"acc_stderr\": 0.012253386187584243,\n \"acc_norm\": 0.35919165580182527,\n\
\ \"acc_norm_stderr\": 0.012253386187584243\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.033455630703391914,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.033455630703391914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n\
\ \"mc1_stderr\": 0.016203316673559696,\n \"mc2\": 0.4673350397051116,\n\
\ \"mc2_stderr\": 0.015209098662952647\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964659\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20394238059135708,\n \
\ \"acc_stderr\": 0.01109860228489918\n }\n}\n```"
repo_url: https://huggingface.co/deepseek-ai/deepseek-coder-7b-instruct-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|arc:challenge|25_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|gsm8k|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hellaswag|10_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T13-06-15.255477.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T13-06-15.255477.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- '**/details_harness|winogrande|5_2024-02-18T13-06-15.255477.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T13-06-15.255477.parquet'
- config_name: results
data_files:
- split: 2024_02_18T13_06_15.255477
path:
- results_2024-02-18T13-06-15.255477.parquet
- split: latest
path:
- results_2024-02-18T13-06-15.255477.parquet
---
# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-7b-instruct-v1.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-7b-instruct-v1.5](https://huggingface.co/deepseek-ai/deepseek-coder-7b-instruct-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one for each evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-coder-7b-instruct-v1.5",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-18T13:06:15.255477](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-7b-instruct-v1.5/blob/main/results_2024-02-18T13-06-15.255477.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5022066458087141,
"acc_stderr": 0.03488325466487485,
"acc_norm": 0.507929969563447,
"acc_norm_stderr": 0.03564529922081144,
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.4673350397051116,
"mc2_stderr": 0.015209098662952647
},
"harness|arc:challenge|25": {
"acc": 0.46331058020477817,
"acc_stderr": 0.014572000527756994,
"acc_norm": 0.4854948805460751,
"acc_norm_stderr": 0.014605241081370053
},
"harness|hellaswag|10": {
"acc": 0.5399322844054969,
"acc_stderr": 0.004973842670559797,
"acc_norm": 0.7234614618601872,
"acc_norm_stderr": 0.00446372107131909
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723456,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723456
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594962,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594962
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.041042692118062316,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.041042692118062316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727062,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727062
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.028358634859836935,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.028358634859836935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5854922279792746,
"acc_stderr": 0.035553003195576686,
"acc_norm": 0.5854922279792746,
"acc_norm_stderr": 0.035553003195576686
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.01992611751386967,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.01992611751386967
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6455696202531646,
"acc_stderr": 0.031137304297185815,
"acc_norm": 0.6455696202531646,
"acc_norm_stderr": 0.031137304297185815
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5201793721973094,
"acc_stderr": 0.033530461674123005,
"acc_norm": 0.5201793721973094,
"acc_norm_stderr": 0.033530461674123005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.043482080516448585,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.043482080516448585
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.038470214204560246,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.038470214204560246
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689049,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689049
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.611749680715198,
"acc_stderr": 0.01742767329554434,
"acc_norm": 0.611749680715198,
"acc_norm_stderr": 0.01742767329554434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.02691504735536981,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.02691504735536981
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.358659217877095,
"acc_stderr": 0.016040454426164464,
"acc_norm": 0.358659217877095,
"acc_norm_stderr": 0.016040454426164464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280545,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280545
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.028394421370984545,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.028394421370984545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35919165580182527,
"acc_stderr": 0.012253386187584243,
"acc_norm": 0.35919165580182527,
"acc_norm_stderr": 0.012253386187584243
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.033455630703391914,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.033455630703391914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3108935128518972,
"mc1_stderr": 0.016203316673559696,
"mc2": 0.4673350397051116,
"mc2_stderr": 0.015209098662952647
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964659
},
"harness|gsm8k|5": {
"acc": 0.20394238059135708,
"acc_stderr": 0.01109860228489918
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hassan4830/urdu-binary-classification-data | ---
license: afl-3.0
---
This Urdu sentiment dataset was formed by concatenating the following two datasets:
- https://github.com/MuhammadYaseenKhan/Urdu-Sentiment-Corpus
- https://www.kaggle.com/datasets/akkefa/imdb-dataset-of-50k-movie-translated-urdu-reviews |
NiolasUa/Niji6 | ---
license: unknown
language:
- en
task_categories:
- text-to-image
tags:
- images
- Art
--- |
useSword/VAE_Default | ---
license: apache-2.0
---
|
BitTranslate/Bittensor_subnet_19_06_04_24 | ---
license: apache-2.0
---
|
pytorch-survival/kkbox | ---
dataset_info:
features:
- name: msno
dtype: string
- name: n_prev_churns
dtype: float32
- name: log_days_between_subs
dtype: float32
- name: log_days_since_reg_init
dtype: float32
- name: log_payment_plan_days
dtype: float32
- name: log_plan_list_price
dtype: float32
- name: log_actual_amount_paid
dtype: float32
- name: is_auto_renew
dtype: float32
- name: is_cancel
dtype: float32
- name: city
dtype: float64
- name: gender
dtype: string
- name: registered_via
dtype: float64
- name: age_at_start
dtype: float32
- name: strange_age
dtype: float32
- name: nan_days_since_reg_init
dtype: float32
- name: no_prev_churns
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: int64
splits:
- name: train
num_bytes: 236008040
num_examples: 1786358
download_size: 105130610
dataset_size: 236008040
---
# Dataset Card for "kkbox"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datamol-io/safe-gpt | ---
license: cc-by-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input
dtype: string
- name: mc_labels
sequence: float64
splits:
- name: train
num_bytes: 203939038678
num_examples: 945455307
- name: test
num_bytes: 25523244912
num_examples: 118890444
- name: validation
num_bytes: 24920275439
num_examples: 118451032
download_size: 270730145
dataset_size: 254382559029
---
# SAFE
Sequential Attachment-based Fragment Embedding (SAFE) is a novel molecular line notation that represents molecules as an unordered sequence of fragment blocks to improve molecule design using generative models.
Details and usage instructions for SAFE are available in the repo https://github.com/datamol-io/safe and the paper https://arxiv.org/pdf/2310.10773.pdf. |
CanariaView/GlobalCopperDemandForecastingDataset | ---
task_categories:
- time-series-forecasting
language:
- en
- ko
tags:
- mining
- LSTM
- TimeSeries
- CanariaView
---
# CanariaView Global Copper Demand Forecasting Dataset
## Description
This dataset encompasses economic and industrial indicators vital for constructing a copper demand forecasting model.
Coverage Period: Monthly data from January 1995 to March 2023, encompassing a total of 339 months.
Column Descriptions and Sources:
- `HSI_value (US Housing Starts Index)`: Y-Chart
- `CCI_value (Consumer Confidence Index)`: OECD
- `IPI_value (Industrial Production Total Index)`: FRED
- `GDPC_value (Real Gross Domestic Product)`: FRED
- `Copper price`: MacroTrends
Preprocessing Methodology and Data Collection Details:
- Comprehensive analysis of data structure followed by essential preprocessing.
- Appropriate handling of missing values.
- Daily and quarterly data uniformly expanded to a monthly timescale for consistency.
- Daily data (e.g., Copper price) and quarterly data (e.g., GDPC_value)
- The dependent-variable data used in the model was available from 1995, which guided the collection of the independent variables (this dataset) from that year onward.
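The frequency-alignment step above can be sketched with pandas; the series values and dates here are placeholders for illustration, not actual dataset values:

```python
import pandas as pd

# Quarterly series (e.g., GDPC_value): expand to monthly by forward-filling
quarterly = pd.Series(
    [100.0, 104.0],
    index=pd.to_datetime(["1995-01-01", "1995-04-01"]),
)
monthly_gdp = quarterly.resample("MS").ffill()  # Jan-Apr, quarterly value repeated per month

# Daily series (e.g., Copper price): collapse to monthly by averaging
daily = pd.Series(
    [2.0, 2.2, 2.4],
    index=pd.to_datetime(["1995-01-02", "1995-01-15", "1995-02-01"]),
)
monthly_price = daily.resample("MS").mean()
```

Forward-filling quarterly values and averaging daily values are two common choices; the card does not state which aggregation was actually used, so treat this as one plausible implementation.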
## Korean Description (translated)
This dataset consists of economic and industrial indicators for building a copper demand forecasting model.
Period: January 1995 to March 2023 (monthly), 339 months in total.
Column descriptions and sources:
- `HSI_value (US Housing Starts Index)`: Y-Chart
- `CCI_value (US Consumer Confidence Index)`: OECD
- `IPI_value (US Industrial Production Index)`: FRED
- `GDPC_value (US Real GDP)`: FRED
- `Copper price`: MacroTrends
Data preprocessing and collection:
- Data structure analysis and preprocessing performed.
- Missing values handled.
- Daily and quarterly data expanded to monthly frequency and integrated into a consistent time series.
  - Daily data (copper price), quarterly data (GDPC_value)
- Since the dependent-variable data used in the demand model was available from 1995, this dataset of independent variables was also collected from 1995 onward. |
DataStudio/OCRWordLevelClear_07 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4665530815.72
num_examples: 1034148
download_size: 4456935622
dataset_size: 4665530815.72
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_68 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1116071436
num_examples: 217473
download_size: 1135596574
dataset_size: 1116071436
---
# Dataset Card for "chunk_68"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datahrvoje/twitter_dataset_1712959428 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21885
num_examples: 48
download_size: 13033
dataset_size: 21885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_0_10000000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 190746
num_examples: 6699
download_size: 122266
dataset_size: 190746
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_0_10000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aintech/vdf_20240130_114906_3faa2_arxiv_abstracts |
---
tags:
- vdf
- vector-io
- vector-dataset
- vector-embeddings
---
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
|
ekinakyurek/ftrace | ---
language:
- en
license:
- cc-by-sa-4.0
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: FTRACE
size_categories:
- 1M<n<10M
source_datasets:
- TRex
- Lama
task_categories:
- influence-attribution
- information-retrieval
- question-answering-retrieval
task_ids:
- influence-attribution
- masked-language-modeling
---
# Dataset Card for "FTRACE"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/ekinakyurek/ftrace
- **Repository:** https://github.com/ekinakyurek/influence
- **Paper:** https://arxiv.org/pdf/2205.11482.pdf
- **Point of Contact:** [Ekin Akyürek](mailto:akyurek@mit.edu)
- **Size of downloaded dataset files:** 113.7 MB
- **Size of the generated dataset:** 1006.6 MB
- **Total amount of disk used:** 1120.3 MB
### Dataset Summary
[PAPER]
FTRACE is a zero-shot information retrieval benchmark devised for tracing a language model's predictions back to training examples. In the accompanying paper, we evaluate commonly studied influence methods, including gradient-based (TracIn) and embedding-based approaches. The dataset contains two parts. First, factual queries for which we trace the knowledge are extracted from existing LAMA queries (Petroni et al., 2019). Second, Wikidata sentences are extracted from the TREx corpus (Elsahar et al., 2018). We annotate the extracted sentences with their stated facts, and these facts can be matched with the facts in the query set. In both parts, we provide (input, target) pairs as a masked language modeling task -- see the examples below. However, one can use the same data in other formats, for example auto-regressive completion, by processing the `inputs_pretokenized` and `targets_pretokenized` fields.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### Abstracts
- **Size of downloaded dataset files:** 112 MB
- **Size of the generated dataset:** 884 MB
- **Total amount of disk used:** 996 MB
An example of 'abstract' looks as follows.
```
{"inputs_pretokenized": "The name Austroasiatic comes from the Latin words for \"south\" and \"Asia\", hence \"<extra_id_0>\".",
"targets_pretokenized": "<extra_id_0> South Asia",
"page_uri": "Q33199",
"masked_uri": "Q771405",
"masked_type": "subject",
"example_uris": "Q33199-1-Q48-Q771405-1",
"facts": "P361,Q48,Q771405;P30,Q48,Q771405",
"id": 8}
```
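The `facts` field packs multiple annotated facts into one semicolon-separated string; a small helper can split it back into URI triples (the predicate/object/subject ordering inside each triple is an assumption, not documented in this card):

```python
def parse_facts(facts: str) -> list[tuple[str, ...]]:
    """Split the semicolon-separated `facts` string into comma-separated
    URI triples, e.g. "P361,Q48,Q771405;P30,Q48,Q771405"."""
    return [tuple(fact.split(",")) for fact in facts.split(";") if fact]

triples = parse_facts("P361,Q48,Q771405;P30,Q48,Q771405")
```

Each recovered triple can then be compared against a query's (`predicate_id`, `obj_uri`, `sub_uri`) fields to find the training abstracts that state the queried fact.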
#### Queries
- **Size of downloaded dataset files:** 1.7 MB
- **Size of the generated dataset:** 8.9 MB
- **Total amount of disk used:** 10.6 MB
An example of 'query' looks as follows.
```
{"inputs_pretokenized": "Paul Ehrlich used to work in <extra_id_0> .",
"targets_pretokenized": "<extra_id_0> Frankfurt",
"uuid": "5b063008-a8ba-4064-9f59-e70102bb8c50",
"obj_uri": "Q1794",
"sub_uri": "Q57089",
"predicate_id": "P937",
"obj_surface": "Frankfurt",
"sub_surface": "Paul Ehrlich"}
```
### Data Fields
The data fields are the same among all splits.
#### Abstracts
- `inputs_pretokenized`: a `string` feature.
- `targets_pretokenized`: a `string` feature.
- `masked_uri`: a `string` feature.
- `masked_type`: a `string` feature.
- `facts`: a `string` feature.
- `id`: a `string` feature.
- `example_uris`: a `string` feature.
- `page_uri`: a `string` feature.
#### Queries
- `inputs_pretokenized`: a `string` feature.
- `targets_pretokenized`: a `string` feature.
- `obj_surface`: a `string` feature.
- `sub_surface`: a `string` feature.
- `obj_uri`: a `string` feature.
- `sub_uri`: a `string` feature.
- `predicate_id`: a `string` feature.
- `uuid`: a `string` feature.
### Data Splits
| name | train |
|-----------|------:|
|Abstracts |1560453|
|Queries |31479 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
LAMA: https://github.com/facebookresearch/LAMA
TRex: https://hadyelsahar.github.io/t-rex/
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The parts of this dataset are available under the [Creative Commons Attribution-ShareAlike License (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/) and [The Creative Commons Attribution-Noncommercial 4.0 International License](https://github.com/facebookresearch/LAMA/blob/master/LICENSE)
### Citation Information
The main paper should be cited as follow:
```
@misc{https://doi.org/10.48550/arxiv.2205.11482,
doi = {10.48550/ARXIV.2205.11482},
url = {https://arxiv.org/abs/2205.11482},
author = {Akyürek, Ekin and Bolukbasi, Tolga and Liu, Frederick and Xiong, Binbin and Tenney, Ian and Andreas, Jacob and Guu, Kelvin},
keywords = {Computation and Language (cs.CL), Information Retrieval (cs.IR), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Tracing Knowledge in Language Models Back to the Training Data},
publisher = {arXiv},
year = {2022},
}
```
Please also cite Petroni et al., 2019 for the query set, and Elsahar et al., 2018 for the abstract set.
```
@inproceedings{petroni2019language,
title={Language Models as Knowledge Bases?},
author={F. Petroni, T. Rockt{\"{a}}schel, A. H. Miller, P. Lewis, A. Bakhtin, Y. Wu and S. Riedel},
booktitle={In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019},
year={2019}
}
```
```
@inproceedings{elsahar2018t,
title={T-rex: A large scale alignment of natural language with knowledge base triples},
author={Elsahar, Hady and Vougiouklis, Pavlos and Remaci, Arslen and Gravier, Christophe and Hare, Jonathon and Laforest, Frederique and Simperl, Elena},
booktitle={Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
year={2018}
}
```
### Contributions |
CyberHarem/arashi_chisato_lovelivesuperstar | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of arashi_chisato/嵐千砂都/아라시치사토 (Love Live! Superstar!!)
This is the dataset of arashi_chisato/嵐千砂都/아라시치사토 (Love Live! Superstar!!), containing 500 images and their tags.
The core tags of this character are `bangs, white_hair, hair_bun, double_bun, red_eyes, long_hair, twintails, blunt_bangs, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 673.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arashi_chisato_lovelivesuperstar/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 328.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arashi_chisato_lovelivesuperstar/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1227 | 741.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arashi_chisato_lovelivesuperstar/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 567.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/arashi_chisato_lovelivesuperstar/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1227 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/arashi_chisato_lovelivesuperstar/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/arashi_chisato_lovelivesuperstar',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, collared_shirt, looking_at_viewer, neck_ribbon, pinafore_dress, red_ribbon, short_sleeves, solo, upper_body, white_shirt, yuigaoka_school_uniform, blush, single_sidelock, smile, birthday, ok_sign, pink_background |
| 1 | 10 |  |  |  |  |  | 1girl, blue_jacket, collared_shirt, grey_dress, long_sleeves, looking_at_viewer, neck_ribbon, open_jacket, pinafore_dress, red_ribbon, solo, white_shirt, yuigaoka_school_uniform, open_mouth, white_background, :d, blush, simple_background, upper_body, teeth |
| 2 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, neck_ribbon, red_ribbon, solo, upper_body, yuigaoka_school_uniform, blue_jacket, collared_shirt, portrait, smile, white_shirt, birthday, blush, long_sleeves, shiny_hair, open_mouth |
| 3 | 8 |  |  |  |  |  | 1girl, happy_birthday, looking_at_viewer, solo, character_name, dated, english_text, upper_body, grin, blush, jacket, short_sleeves, signature, single_sidelock |
| 4 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, midriff, navel, solo, collarbone, smile, blush, open_jacket, pink_jacket, open_mouth, crop_top, long_sleeves, small_breasts, black_shorts, off_shoulder, one_eye_closed, pink_background, upper_body, white_tank_top |
| 5 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, tiara, upper_body, white_gloves, blush, earrings, necklace, smile, collarbone, crown, open_mouth, elbow_gloves, pink_dress, puffy_short_sleeves, purple_dress |
| 6 | 5 |  |  |  |  |  | 2girls, looking_at_viewer, smile, solo_focus, holding_hands, orange_hair, boots, mini_hat, pink_dress |
| 7 | 17 |  |  |  |  |  | 1girl, blush, nipples, completely_nude, navel, collarbone, pussy, small_breasts, 1boy, hetero, censored, open_mouth, solo_focus, penis, sex, sweat, closed_eyes, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collared_shirt | looking_at_viewer | neck_ribbon | pinafore_dress | red_ribbon | short_sleeves | solo | upper_body | white_shirt | yuigaoka_school_uniform | blush | single_sidelock | smile | birthday | ok_sign | pink_background | blue_jacket | grey_dress | long_sleeves | open_jacket | open_mouth | white_background | :d | simple_background | teeth | portrait | shiny_hair | happy_birthday | character_name | dated | english_text | grin | jacket | signature | midriff | navel | collarbone | pink_jacket | crop_top | small_breasts | black_shorts | off_shoulder | one_eye_closed | white_tank_top | tiara | white_gloves | earrings | necklace | crown | elbow_gloves | pink_dress | puffy_short_sleeves | purple_dress | 2girls | solo_focus | holding_hands | orange_hair | boots | mini_hat | nipples | completely_nude | pussy | 1boy | hetero | censored | penis | sex | sweat | closed_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:--------------|:-----------------|:-------------|:----------------|:-------|:-------------|:--------------|:--------------------------|:--------|:------------------|:--------|:-----------|:----------|:------------------|:--------------|:-------------|:---------------|:--------------|:-------------|:-------------------|:-----|:--------------------|:--------|:-----------|:-------------|:-----------------|:-----------------|:--------|:---------------|:-------|:---------|:------------|:----------|:--------|:-------------|:--------------|:-----------|:----------------|:---------------|:---------------|:-----------------|:-----------------|:--------|:---------------|:-----------|:-----------|:--------|:---------------|:-------------|:----------------------|:---------------|:---------|:-------------|:----------------|:--------------|:--------|:-----------|:----------|:------------------|:--------|:-------|:---------|:-----------|:--------|:------|:--------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | X | | X | X | X | X | X | | X | X | | | X | | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | | | | X | X | X | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | | X | | | | | X | X | | | X | | X | | | X | | | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | | X | X | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | | | | | | | | | | |
| 7 | 17 |  |  |  |  |  | X | | X | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X |
|
ID3/comentario_youtube_lorea_sin_input | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4904984
num_examples: 3538
download_size: 1682813
dataset_size: 4904984
---
# Dataset Card for "comentario_youtube_lorea_sin_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/Doctor-Shotgun_capybara-sharegpt_subset_split_3 | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 9064100.571348244
num_examples: 2001
download_size: 4628176
dataset_size: 9064100.571348244
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/metatree_BNG_mfeat_zernike_ | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 277411860
num_examples: 700535
- name: validation
num_bytes: 118588140
num_examples: 299465
download_size: 476793911
dataset_size: 396000000
---
# Dataset Card for "metatree_BNG_mfeat_zernike_"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train5000_eval5000_v1_doc_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 581636
num_examples: 5000
- name: train_recite_qa
num_bytes: 3790343
num_examples: 5000
- name: eval_qa
num_bytes: 580393
num_examples: 5000
- name: eval_recite_qa
num_bytes: 3785337
num_examples: 5000
- name: all_docs
num_bytes: 5846467
num_examples: 8964
- name: all_docs_eval
num_bytes: 5845967
num_examples: 8964
- name: train
num_bytes: 6428103
num_examples: 13964
- name: validation
num_bytes: 580393
num_examples: 5000
download_size: 17084473
dataset_size: 27438639
---
# Dataset Card for "lmind_nq_train5000_eval5000_v1_doc_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OCR-Ethiopic/HHD-Ethiopic | ---
license: cc-by-4.0
---
## HHD-Ethiopic Dataset
This dataset, named "HHD-Ethiopic," is designed for Ethiopic text-image recognition tasks. It contains a collection of historical handwritten manuscripts in the Ethiopic script. The dataset is intended to facilitate research and development in Ethiopic text-image recognition.
### Dataset Details
- __Size__: 79,684 <br>
- __Training Set__: 57,374 <br>
- __Test Set__: HHD-Ethiopic consists of two separate Test sets
- __Test Set I (IID)__: 6,375 images (randomly drawn from the training set)
- __Test Set II (OOD)__: 15,935 images (specifically from manuscripts dated in the 18th century) <br>
- __Validation Set__: 10% of the training set, randomly drawn <br>
- __Number of unique Ethiopic characters__ :306
- __Dataset Formats__: the HHD-Ethiopic dataset is stored in two different formats to accommodate different use cases:
  - __Raw Image and Ground-truth Text__: consists of the original images and their corresponding ground-truth text.
The dataset is structured as raw images (.png) accompanied by a [train CSV file](https://huggingface.co/datasets/OCR-Ethiopic/HHD-Ethiopic/blob/main/train/train_raw/image_text_pairs_train.csv), [test-I CSV file](https://huggingface.co/datasets/OCR-Ethiopic/HHD-Ethiopic/blob/main/test/test_rand/image_text_pairs_test_rand.csv), and [test-II CSV file](https://huggingface.co/datasets/OCR-Ethiopic/HHD-Ethiopic/blob/main/test/test_18th/image_text_pairs_test_18th.csv) that contain the file names of the images and their respective ground-truth text for the training and two test sets, respectively.<br>
  - __Numpy Format__: both the images and the ground-truth text are stored as pre-processed numpy arrays that can be used directly for training and testing models.
- __Metadata__ (Human-Level Performance): we have also included metadata on the human-level performance predicted by individuals for the test sets. This metadata provides insight into the level of performance humans achieve on historical Ethiopic text-image recognition tasks.
- __Test Set I__ - for test set I, a group of 9 individuals was presented with a random subset of the dataset. They were asked to perform Ethiopic text-image recognition and provide their best efforts to transcribe the handwritten texts. The results were collected and stored in a CSV file, [Test-I-human_performance](https://github.com/bdu-birhanu/HHD-Ethiopic/blob/main/Dataset/human-level-predictions/6375_new_all.csv) included in the dataset.
  - __Test Set II__ - test set II was prepared exclusively from Ethiopic historical handwritten documents dated to the 18th century. A different group of 4 individuals was given this subset for evaluation. The human-level performance predictions for this set are also stored in a separate CSV file, [Test-II_human_performance](https://github.com/bdu-birhanu/HHD-Ethiopic/blob/main/Dataset/human-level-predictions/15935_new_all.csv)
Please refer to the respective CSV files for detailed information on the human-level performance predictions. Each CSV file contains the necessary metadata, including the image filenames, ground-truth text, and the corresponding human-generated transcriptions.
If you would like to explore or analyze the human-level performance data further, please refer to the provided CSV files.
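To compare a model's transcriptions against these human predictions, character error rate (CER) is the usual metric for text-image recognition; below is a minimal sketch (this is not the dataset's official evaluation code):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
    	
        prev = curr
    return prev[-1]

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: edit operations normalized by reference length."""
    return edit_distance(reference, hypothesis) / max(len(reference), 1)
```

Applied to each row of the human-performance CSVs, this gives a per-line score that can be averaged over a test set and compared with a model's output.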
#### Citation
If you use the hhd-ethiopic dataset in your research, please consider citing it:
```
@misc {author_2023,
    author       = { Anonymous-author },
    title        = { HHD-Ethiopic: A Historical Handwritten Dataset for Ethiopic OCR with Baseline Models and Human-level Performance (Revision 50c1e04) },
year = 2023,
url = { https://huggingface.co/datasets/OCR-Ethiopic/HHD-Ethiopic },
doi = { 10.57967/hf/0691 },
publisher = { Hugging Face }
}
```
#### License
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.
|
rootacess/pie-perf | ---
dataset_info:
features:
- name: user_id
dtype: string
- name: problem_id
dtype: string
- name: language
dtype: string
- name: submission_id_v0
dtype: string
- name: submission_id_v1
dtype: string
- name: cpu_time_v0
dtype: int64
- name: cpu_time_v1
dtype: int64
- name: memory_v0
dtype: int64
- name: memory_v1
dtype: int64
- name: status_v0
dtype: string
- name: status_v1
dtype: string
- name: improvement_frac
dtype: float64
- name: input
dtype: string
- name: target
dtype: string
- name: code_v0_loc
dtype: int64
- name: code_v1_loc
dtype: int64
- name: code_v0_num_chars
dtype: int64
- name: code_v1_num_chars
dtype: int64
- name: code_v0_no_empty_lines
dtype: string
- name: code_v1_no_empty_lines
dtype: string
- name: code_same
dtype: bool
- name: relative_loc_diff_percent
dtype: float64
- name: diff
sequence: string
- name: diff_only_import_comment
dtype: bool
- name: measured_runtime_v0
dtype: float64
- name: measured_runtime_v1
dtype: float64
- name: runtime_lift
dtype: float64
- name: key
sequence: string
splits:
- name: train
num_bytes: 110329743
num_examples: 36857
- name: val
num_bytes: 5942994
num_examples: 1940
- name: test
num_bytes: 2714513
num_examples: 1000
- name: codegen_1shot_test
num_bytes: 3003513
num_examples: 1000
download_size: 56295756
dataset_size: 121990763
---
# Dataset Card for "pie-perf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lorinma/EvolInstruct_zh_GPT3.5 | ---
task_categories:
- conversational
- text-generation
language:
- zh
size_categories:
- 10K<n<100K
---
私以为这并不是一次很成功的尝试。猜测一个主要原因是prompt依然是英文的,只是增加了the locale of the prompt is mainland china.
因为WizardLM系列长期霸榜LLM开源榜,一直很好奇EvolInstruct在英文世界表现出的对于复杂prompt的应对能力。
目前中文没有原生的EvolInstruct,仅有两个翻译版本 [1](https://huggingface.co/datasets/FreedomIntelligence/Evol-Instruct-Chinese-GPT4) [2](https://huggingface.co/datasets/silk-road/Wizard-LM-Chinese-instruct-evol)。
故浅浅尝试复现中文版本。代码参照 [3](https://github.com/h2oai/h2o-wizardlm/blob/main/wizardlm.py)
但无奈接口实在是太贵,且生成的时间很长。所以如果有能够提供GPT-4 API资源的,我很乐意将这个量级撑到50K+并进行公开。
一共有3个文件:
combined_seed_correct.json 是使用的基础种子任务371条,alpaca格式。使用了 [Belle的中文种子任务175条](https://github.com/LianjiaTech/BELLE)。并且参照了 [4](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k) 增加了ShareGPT的数据以更接近真实世界的用法,掺入了 [Wildchat-zh抽样196条](https://huggingface.co/datasets/lorinma/Wildchat_zh_sharegpt_Subsample_20K) ,多轮对话只采用第一个有意义的问答对。
231213_ChineseEvolInstruct_140_gpt-4-1106-preview.json 使用gpt-4-1106-preview,因为太贵且接口不稳定,故只生成了140条。这里犯了一个错误,只使用了instruction而忽略了input,所以evol的基础不完整。接口花费约几百人民币。
231214_ChineseEvolInstruction_11k_3.5-turbo-0613.json fixes that mistake by concatenating the instruction and input, and uses the 3.5-turbo-0613 API to generate a total of 11K alpaca-format question-answer pairs. API cost: about one thousand RMB; generation time: about 24 hours.
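The instruction/input concatenation fix described above can be sketched as below. This is an illustrative reconstruction, not the actual script: the field names follow the alpaca convention, and the newline join is an assumed format.

```python
def build_evol_seed(record: dict) -> str:
    """Build the evolution seed prompt from an alpaca-format record by
    concatenating its instruction and (optional) input fields.
    NOTE: the joining format is an assumption for illustration."""
    instruction = record.get("instruction", "").strip()
    extra_input = record.get("input", "").strip()
    # Only append the input when it is non-empty, so pure-instruction
    # records evolve on the instruction alone.
    if extra_input:
        return f"{instruction}\n{extra_input}"
    return instruction
```

Using the full seed (instruction plus input) gives the evolution prompt the complete task context, avoiding the incompleteness issue of the 140-example GPT-4 run.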
|
AdapterOcean/data-standardized_embedded | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 801713359
num_examples: 129062
download_size: 0
dataset_size: 801713359
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathan-roberts1/RSI-CB256 | ---
dataset_info:
features:
- name: label_1
dtype:
class_label:
names:
'0': transportation
'1': other objects
'2': woodland
'3': water area
'4': other land
'5': cultivated land
'6': construction land
- name: label_2
dtype:
class_label:
names:
'0': parking lot
'1': avenue
'2': highway
'3': bridge
'4': marina
'5': crossroads
'6': airport runway
'7': pipeline
'8': town
'9': airplane
'10': forest
'11': mangrove
'12': artificial grassland
'13': river protection forest
'14': shrubwood
'15': sapling
'16': sparse forest
'17': lakeshore
'18': river
'19': stream
'20': coastline
'21': hirst
'22': dam
'23': sea
'24': snow mountain
'25': sandbeach
'26': mountain
'27': desert
'28': dry farm
'29': green farmland
'30': bare land
'31': city building
'32': residents
'33': container
'34': storage room
- name: image
dtype: image
splits:
- name: train
num_bytes: 4901667781.625
num_examples: 24747
download_size: 4198991130
dataset_size: 4901667781.625
license: other
task_categories:
- image-classification
- zero-shot-image-classification
---
# Dataset Card for "RSI-CB256"
## Dataset Description
- **Paper** [Exploring Models and Data for Remote Sensing Image Caption Generation](https://ieeexplore.ieee.org/iel7/36/4358825/08240966.pdf)
### Licensing Information
For academic purposes.
## Citation Information
[Exploring Models and Data for Remote Sensing Image Caption Generation](https://ieeexplore.ieee.org/iel7/36/4358825/08240966.pdf)
```
@article{lu2017exploring,
title = {Exploring Models and Data for Remote Sensing Image Caption Generation},
author = {Lu, Xiaoqiang and Wang, Binqiang and Zheng, Xiangtao and Li, Xuelong},
journal = {IEEE Transactions on Geoscience and Remote Sensing},
volume = 56,
number = 4,
pages = {2183--2195},
doi = {10.1109/TGRS.2017.2776321},
year={2018}
}
``` |
lafayette-group/thermal-drone-imagery | ---
license: cc-by-nc-sa-2.0
---
|
Piyush2512/tester | ---
dataset_info:
features:
- name: audio_data
dtype: binary
- name: emotion
dtype:
class_label:
names:
'0': anger
'1': sadness
'2': fear
'3': happy
'4': disgusted
'5': neutral
splits:
- name: train
num_bytes: 605989240
num_examples: 7442
download_size: 605592970
dataset_size: 605989240
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shi3z/Qarasu_Wikipedia_Multiturn | ---
license: apache-2.0
---
Japanese multi-turn conversation data generated with Qarasu14B from Wikipedia data. Available for non-commercial use only (because Qarasu14B was trained on ShareGPT data).
# Model
https://huggingface.co/lightblue/qarasu-14B-chat-plus-unleashed
# Dataset
https://huggingface.co/datasets/izumi-lab/wikipedia-ja-20230720
# Developed by
FreeAI Ltd., Tsuginosuke AI Super Computer (A100 80G x8)
https://www.free-ai.ltd/ |
nblinh63/twitter_dataset_1712695419 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 80991
num_examples: 201
download_size: 38742
dataset_size: 80991
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TRealArthur/AiModelForCovers | ---
license: cc
---
|
gguichard/wsd_myriade_synth_data_multilabel_bloom-1b7 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: float64
splits:
- name: train
num_bytes: 50997821.315798305
num_examples: 96254
- name: test
num_bytes: 2684625.6842016955
num_examples: 5067
download_size: 16265770
dataset_size: 53682447.0
---
# Dataset Card for "wsd_myriade_synth_data_multilabel_bloom-1b7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BByrneLab/RAVQAV2Data | ---
license: mit
task_categories:
- question-answering
language:
- en
tags:
- VQA
- KBVQA
- RAVQA
- Retrieval
---
This is the official release of resources for RAVQA-V2. This repository contains the pre-extracted features for OK-VQA and the pre-trained checkpoints for RAVQA-V2 (equipped with Fine-grained Late-interaction Multi-modal Retrieval).
The code can be found on [Github](https://github.com/LinWeizheDragon/Retrieval-Augmented-Visual-Question-Answering/tree/RAVQAv2)
|
Jing24/high-train-all | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 79697844
num_examples: 87599
download_size: 50500826
dataset_size: 79697844
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "high-train-all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lilith_thedemongirlnextdoor | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Lilith
This is the dataset of Lilith, containing 132 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 132 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 322 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 132 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 132 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 132 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 132 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 132 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 322 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 322 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 322 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ciscak/networks-test1 | ---
license: mit
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ryan7653/Jim | ---
license: wtfpl
---
|
iElexperio/processedMorDataLLMv3NewLabels2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence: int64
- name: image
dtype: image
splits:
- name: train
num_bytes: 8865284.0
num_examples: 70
- name: test
num_bytes: 3461510.0
num_examples: 28
download_size: 0
dataset_size: 12326794.0
---
# Dataset Card for "processedMorDataLLMv3NewLabels2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aboros98__merlin1.5 | ---
pretty_name: Evaluation run of aboros98/merlin1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aboros98/merlin1.5](https://huggingface.co/aboros98/merlin1.5) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aboros98__merlin1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T14:32:14.694038](https://huggingface.co/datasets/open-llm-leaderboard/details_aboros98__merlin1.5/blob/main/results_2024-03-15T14-32-14.694038.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5678897197189494,\n\
\ \"acc_stderr\": 0.0339682715364421,\n \"acc_norm\": 0.5694899190488795,\n\
\ \"acc_norm_stderr\": 0.03466672337661833,\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.01645126444006824,\n \"mc2\": 0.48030718490694285,\n\
\ \"mc2_stderr\": 0.015257213020870488\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137993,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.563931487751444,\n\
\ \"acc_stderr\": 0.004948824501355489,\n \"acc_norm\": 0.746265684126668,\n\
\ \"acc_norm_stderr\": 0.004342580277662736\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777628,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777628\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.02507598176760168,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.02507598176760168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806586,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806586\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909878,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909878\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102335,\n\
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102335\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.017818849564796634,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.017818849564796634\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035293,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035293\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n\
\ \"acc_stderr\": 0.016808322261740463,\n \"acc_norm\": 0.6704980842911877,\n\
\ \"acc_norm_stderr\": 0.016808322261740463\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576277,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n\
\ \"acc_stderr\": 0.01502408388332288,\n \"acc_norm\": 0.28044692737430166,\n\
\ \"acc_norm_stderr\": 0.01502408388332288\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.02712511551316685,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.02712511551316685\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.012570871032146078,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.012570871032146078\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734576,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02011692534742242,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02011692534742242\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.0301164262965406,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.0301164262965406\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3292533659730722,\n\
\ \"mc1_stderr\": 0.01645126444006824,\n \"mc2\": 0.48030718490694285,\n\
\ \"mc2_stderr\": 0.015257213020870488\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233625\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5026535253980288,\n \
\ \"acc_stderr\": 0.01377229076885817\n }\n}\n```"
repo_url: https://huggingface.co/aboros98/merlin1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|arc:challenge|25_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|gsm8k|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hellaswag|10_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-32-14.694038.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T14-32-14.694038.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- '**/details_harness|winogrande|5_2024-03-15T14-32-14.694038.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T14-32-14.694038.parquet'
- config_name: results
data_files:
- split: 2024_03_15T14_32_14.694038
path:
- results_2024-03-15T14-32-14.694038.parquet
- split: latest
path:
- results_2024-03-15T14-32-14.694038.parquet
---
# Dataset Card for Evaluation run of aboros98/merlin1.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aboros98/merlin1.5](https://huggingface.co/aboros98/merlin1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aboros98__merlin1.5",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-15T14:32:14.694038](https://huggingface.co/datasets/open-llm-leaderboard/details_aboros98__merlin1.5/blob/main/results_2024-03-15T14-32-14.694038.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5678897197189494,
"acc_stderr": 0.0339682715364421,
"acc_norm": 0.5694899190488795,
"acc_norm_stderr": 0.03466672337661833,
"mc1": 0.3292533659730722,
"mc1_stderr": 0.01645126444006824,
"mc2": 0.48030718490694285,
"mc2_stderr": 0.015257213020870488
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137993,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.563931487751444,
"acc_stderr": 0.004948824501355489,
"acc_norm": 0.746265684126668,
"acc_norm_stderr": 0.004342580277662736
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777628,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777628
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.02507598176760168,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.02507598176760168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806586,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806586
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909878,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909878
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102335,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102335
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.017818849564796634,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.017818849564796634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035293,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035293
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740463,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740463
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.01502408388332288,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.01502408388332288
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751468,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751468
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.02712511551316685,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.02712511551316685
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.012570871032146078,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.012570871032146078
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.03027332507734576,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.03027332507734576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02011692534742242,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02011692534742242
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.0301164262965406,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.0301164262965406
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3292533659730722,
"mc1_stderr": 0.01645126444006824,
"mc2": 0.48030718490694285,
"mc2_stderr": 0.015257213020870488
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233625
},
"harness|gsm8k|5": {
"acc": 0.5026535253980288,
"acc_stderr": 0.01377229076885817
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
snips_built_in_intents | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc0-1.0
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- intent-classification
paperswithcode_id: snips
pretty_name: SNIPS Natural Language Understanding benchmark
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': ComparePlaces
'1': RequestRide
'2': GetWeather
'3': SearchPlace
'4': GetPlaceDetails
'5': ShareCurrentLocation
'6': GetTrafficInformation
'7': BookRestaurant
'8': GetDirections
'9': ShareETA
splits:
- name: train
num_bytes: 19431
num_examples: 328
download_size: 9130264
dataset_size: 19431
train-eval-index:
- config: default
task: text-classification
task_id: multi_class_classification
train_split: train
col_mapping:
text: text
label: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for Snips Built In Intents
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/sonos/nlu-benchmark/tree/master/2016-12-built-in-intents
- **Repository:** https://github.com/sonos/nlu-benchmark/tree/master/2016-12-built-in-intents
- **Paper:** https://arxiv.org/abs/1805.10190
- **Point of Contact:** The Snips team joined Sonos in November 2019. These open datasets remain available and their access is now managed by the Sonos Voice Experience Team. Please email sve-research@sonos.com with any questions.
### Dataset Summary
Snips' built-in intents dataset was initially used to compare different voice assistants and was released as a public dataset hosted at
https://github.com/sonos/nlu-benchmark in the folder 2016-12-built-in-intents. The dataset contains 328 utterances over 10 intent classes.
A related Medium post is https://medium.com/snips-ai/benchmarking-natural-language-understanding-systems-d35be6ce568d.
### Supported Tasks and Leaderboards
There are no related shared tasks that we are aware of.
### Languages
English
## Dataset Structure
### Data Instances
The dataset contains 328 utterances over 10 intent classes. Each sample looks like:
`{'label': 8, 'text': 'Transit directions to Barcelona Pizza.'}`
### Data Fields
- `text`: The text utterance expressing some user intent.
- `label`: The intent label of the piece of text utterance.
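As a quick sketch of working with these fields, the integer `label` can be mapped back to its intent name using the class names declared in the `dataset_info` metadata above (the list below is copied verbatim from that metadata; loading through the `datasets` library, `ClassLabel.int2str` performs the same lookup):

```python
# Intent names copied from the `dataset_info` metadata in this card.
INTENT_NAMES = [
    "ComparePlaces", "RequestRide", "GetWeather", "SearchPlace",
    "GetPlaceDetails", "ShareCurrentLocation", "GetTrafficInformation",
    "BookRestaurant", "GetDirections", "ShareETA",
]

# The sample shown above, decoded:
sample = {"label": 8, "text": "Transit directions to Barcelona Pizza."}
print(INTENT_NAMES[sample["label"]])  # GetDirections
```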
### Data Splits
The source data is not split.
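Because the source ships as a single split, any train/test separation is left to the user. A minimal sketch of a reproducible hold-out (the 80/20 ratio and the seed are arbitrary choices, and the placeholder list merely stands in for the 328 real utterances):

```python
import random

# Placeholder records standing in for the 328 labelled utterances.
examples = [{"text": f"utterance {i}", "label": i % 10} for i in range(328)]

# Shuffle with a fixed seed so the split is reproducible, then cut 80/20.
rng = random.Random(42)
rng.shuffle(examples)
cut = int(0.8 * len(examples))
train, test = examples[:cut], examples[cut:]
print(len(train), len(test))  # 262 66
```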
## Dataset Creation
### Curation Rationale
The dataset was originally created to compare the performance of a number of voice assistants. However, the labelled utterances are useful
for developing and benchmarking text chatbots as well.
### Source Data
#### Initial Data Collection and Normalization
It is not clear how the data was collected. From the Medium post: `The benchmark relies on a set of 328 queries built by the business team
at Snips, and kept secret from data scientists and engineers throughout the development of the solution.`
#### Who are the source language producers?
Originally prepared by snips.ai. The Snips team joined Sonos in November 2019. These open datasets remain available and their
access is now managed by the Sonos Voice Experience Team. Please email sve-research@sonos.com with any questions.
### Annotations
#### Annotation process
It is not clear how the data was collected. From the Medium post: `The benchmark relies on a set of 328 queries built by the business team
at Snips, and kept secret from data scientists and engineers throughout the development of the solution.`
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Originally prepared by snips.ai. The Snips team joined Sonos in November 2019. These open datasets remain available and their
access is now managed by the Sonos Voice Experience Team. Please email sve-research@sonos.com with any questions.
### Licensing Information
The source data is licensed under Creative Commons Zero v1.0 Universal.
### Citation Information
Any publication based on these datasets must include a full citation to the following paper in which the results were published by the Snips Team:
Coucke A. et al., "Snips Voice Platform: an embedded Spoken Language Understanding system for private-by-design voice interfaces." CoRR 2018,
https://arxiv.org/abs/1805.10190
### Contributions
Thanks to [@bduvenhage](https://github.com/bduvenhage) for adding this dataset. |
callmezombie/holoart | ---
license: creativeml-openrail-m
language:
- en
tags:
- not-for-all-audiences
- art
--- |
bigbio/mlee |
---
language:
- en
bigbio_language:
- English
license: cc-by-nc-sa-3.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_NC_SA_3p0
pretty_name: MLEE
homepage: http://www.nactem.ac.uk/MLEE/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- EVENT_EXTRACTION
- NAMED_ENTITY_RECOGNITION
- RELATION_EXTRACTION
- COREFERENCE_RESOLUTION
---
# Dataset Card for MLEE
## Dataset Description
- **Homepage:** http://www.nactem.ac.uk/MLEE/
- **Pubmed:** True
- **Public:** True
- **Tasks:** EE,NER,RE,COREF
MLEE is an event extraction corpus consisting of manually annotated abstracts of papers
on angiogenesis. It contains annotations for entities, relations, events, and coreferences.
The annotations span molecular, cellular, tissue, and organ-level processes.
## Citation Information
```
@article{pyysalo2012event,
title={Event extraction across multiple levels of biological organization},
author={Pyysalo, Sampo and Ohta, Tomoko and Miwa, Makoto and Cho, Han-Cheol and Tsujii, Jun'ichi and Ananiadou, Sophia},
journal={Bioinformatics},
volume={28},
number={18},
pages={i575--i581},
year={2012},
publisher={Oxford University Press}
}
```
|
open-llm-leaderboard/details_ichigoberry__pandafish-7b | ---
pretty_name: Evaluation run of ichigoberry/pandafish-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ichigoberry/pandafish-7b](https://huggingface.co/ichigoberry/pandafish-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ichigoberry__pandafish-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T02:05:36.646558](https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__pandafish-7b/blob/main/results_2024-04-03T02-05-36.646558.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.650984521894478,\n\
\ \"acc_stderr\": 0.032097685717085274,\n \"acc_norm\": 0.6534019201600828,\n\
\ \"acc_norm_stderr\": 0.03274252873518444,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5268732541669081,\n\
\ \"mc2_stderr\": 0.014927268430500533\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910476,\n\
\ \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6593308105954989,\n\
\ \"acc_stderr\": 0.004729656826803945,\n \"acc_norm\": 0.8528181637124079,\n\
\ \"acc_norm_stderr\": 0.003535630289091459\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.02537952491077841,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.02537952491077841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572223,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572223\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.01612554382355295,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.01612554382355295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.01871806705262323,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.01871806705262323\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139967,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139967\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5268732541669081,\n\
\ \"mc2_stderr\": 0.014927268430500533\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5830174374526156,\n \
\ \"acc_stderr\": 0.013581320997216586\n }\n}\n```"
repo_url: https://huggingface.co/ichigoberry/pandafish-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-05-36.646558.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T02-05-36.646558.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- '**/details_harness|winogrande|5_2024-04-03T02-05-36.646558.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T02-05-36.646558.parquet'
- config_name: results
data_files:
- split: 2024_04_03T02_05_36.646558
path:
- results_2024-04-03T02-05-36.646558.parquet
- split: latest
path:
- results_2024-04-03T02-05-36.646558.parquet
---
# Dataset Card for Evaluation run of ichigoberry/pandafish-7b
Dataset automatically created during the evaluation run of model [ichigoberry/pandafish-7b](https://huggingface.co/ichigoberry/pandafish-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
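As an illustration of the split-naming convention (inferred from the split names in this card, not an official API), a run timestamp maps to its split name by replacing the `-` and `:` separators with underscores:

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to its dataset split name.

    Inferred convention: dashes and colons become underscores,
    the fractional-seconds dot is kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")


# The run shown in this card:
print(timestamp_to_split_name("2024-04-03T02:05:36.646558"))
# → 2024_04_03T02_05_36.646558
```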
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ichigoberry__pandafish-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-03T02:05:36.646558](https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__pandafish-7b/blob/main/results_2024-04-03T02-05-36.646558.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.650984521894478,
"acc_stderr": 0.032097685717085274,
"acc_norm": 0.6534019201600828,
"acc_norm_stderr": 0.03274252873518444,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5268732541669081,
"mc2_stderr": 0.014927268430500533
},
"harness|arc:challenge|25": {
"acc": 0.613481228668942,
"acc_stderr": 0.014230084761910476,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179344
},
"harness|hellaswag|10": {
"acc": 0.6593308105954989,
"acc_stderr": 0.004729656826803945,
"acc_norm": 0.8528181637124079,
"acc_norm_stderr": 0.003535630289091459
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.02537952491077841,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.02537952491077841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572223,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572223
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355295,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.01871806705262323,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.01871806705262323
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139967,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139967
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5268732541669081,
"mc2_stderr": 0.014927268430500533
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.5830174374526156,
"acc_stderr": 0.013581320997216586
}
}
```
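The per-task entries in the results dict above can be aggregated into a single score; a minimal sketch of averaging the `hendrycksTest` (MMLU) accuracies, using a small hard-coded subset of the displayed results for illustration (in practice you would parse the full results JSON file):

```python
# Sketch: averaging per-task accuracies from a results dict like the one above.
# The dict below is a hand-copied subset of the displayed results, purely for
# illustration; load the real results JSON to aggregate all 57 MMLU tasks.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.6},
}

mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"Mean accuracy over {len(mmlu_tasks)} MMLU tasks: {mmlu_acc:.4f}")
```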
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
IntelLabs/WAT-WorldAcrossTime | ---
license: cc-by-nc-4.0
pretty_name: World Across Time
---
The World Across Time (WAT) dataset used in the paper "CLNeRF: Continual Learning Meets NeRF". It contains multiple COLMAP-reconstructed scenes used for continual learning of NeRFs. For each scene, we provide multiple scans captured at different times, where the same scene appears under different appearance and geometry conditions. Please refer to our video (https://youtu.be/nLRt6OoDGq0) or GitHub repo (https://github.com/IntelLabs/CLNeRF) for further details and instructions on how to use this dataset.
This content is provided in support of our paper, CLNeRF: Continual Learning Meets NeRF, accepted by the IEEE/CVF International Conference on Computer Vision (ICCV) 2023. This content is provided here for research purposes only and the dataset(s) used is licensed under CC BY-NC 4.0. By accessing the dataset(s), you agree to the terms associated with those datasets and that your use complies with the applicable license. Any use beyond this is your sole responsibility and subject to your securing the necessary rights for your purpose.
Intel is not liable for any errors, omissions, or defects in the data, or for any reliance on the data. |
AdapterOcean/Open_Platypus_standardized_cluster_0_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7794914
num_examples: 10635
download_size: 0
dataset_size: 7794914
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_0_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PROCESOS/id_anversoAntiguo | ---
license: c-uda
---
|
adhitya123/Gita1gpt | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context
dtype: string
- name: title
dtype: string
splits:
- name: validation
num_bytes: 4575
num_examples: 15
download_size: 6694
dataset_size: 4575
---
# Dataset Card for "Gita1gpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lchakkei/OpenOrca-Traditional-Chinese-LLama2-Format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6745713021
num_examples: 4233915
download_size: 3983934887
dataset_size: 6745713021
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_72 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1335023084
num_examples: 260137
download_size: 1362779374
dataset_size: 1335023084
---
# Dataset Card for "chunk_72"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DennisR96/Lisa | ---
license: mit
---
|
Shreyam/sentiment-labeled-3-matrix | ---
license: mit
task_categories:
- text-classification
pretty_name: Labeled Sentiment 3 Matrix Dataset
size_categories:
- 100K<n<1M
--- |
satwikapaul/test_braille | ---
license: openrail
---
|
faizalnf1800/karambit-knife-object | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 791480.0
num_examples: 27
download_size: 767049
dataset_size: 791480.0
---
Still collecting the dataset. Raw pictures of the karambit knife object are available on Google Drive: [Link](https://drive.google.com/file/d/1fFRSxeTt9Tvj6d7PFdJWJ56LXR5shoH4/view?usp=share_link) |
ajibawa-2023/Julia-Proof-Pile-2 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 100M<n<1B
tags:
- code
---
**Julia-Proof-Pile-2**
This dataset is part of the Proof-Pile-2 dataset. It consists of mathematical code, including numerical computing, computer algebra, and formal mathematics.
This entire dataset is in the Julia language. It is slightly more than 0.5 billion tokens. I have removed the metadata from this dataset, hence you can use it directly for training.
This dataset is in JSONL format.
|
Spico/ChCatExt | ---
license: apache-2.0
language:
- zh
tags:
- finance
--- |
Juanchoxs/model1 | ---
license: openrail
---
|
DFKI-SLT/gids | ---
annotations_creators:
- other
language:
- en
language_creators:
- found
license:
- other
multilinguality:
- monolingual
pretty_name: Google-IISc Distant Supervision (GIDS) dataset for distantly-supervised
relation extraction
size_categories:
- 10K<n<100K
source_datasets:
- extended|other
tags:
- relation extraction
task_categories:
- text-classification
task_ids:
- multi-class-classification
dataset_info:
- config_name: gids
features:
- name: sentence
dtype: string
- name: subj_id
dtype: string
- name: obj_id
dtype: string
- name: subj_text
dtype: string
- name: obj_text
dtype: string
- name: relation
dtype:
class_label:
names:
'0': NA
'1': /people/person/education./education/education/institution
'2': /people/person/education./education/education/degree
'3': /people/person/place_of_birth
'4': /people/deceased_person/place_of_death
splits:
- name: train
num_bytes: 5088421
num_examples: 11297
- name: validation
num_bytes: 844784
num_examples: 1864
- name: test
num_bytes: 2568673
num_examples: 5663
download_size: 8941490
dataset_size: 8501878
- config_name: gids_formatted
features:
- name: token
sequence: string
- name: subj_start
dtype: int32
- name: subj_end
dtype: int32
- name: obj_start
dtype: int32
- name: obj_end
dtype: int32
- name: relation
dtype:
class_label:
names:
'0': NA
'1': /people/person/education./education/education/institution
'2': /people/person/education./education/education/degree
'3': /people/person/place_of_birth
'4': /people/deceased_person/place_of_death
splits:
- name: train
num_bytes: 7075362
num_examples: 11297
- name: validation
num_bytes: 1173957
num_examples: 1864
- name: test
num_bytes: 3573706
num_examples: 5663
download_size: 8941490
dataset_size: 11823025
---
# Dataset Card for "gids"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Repository:** [RE-DS-Word-Attention-Models](https://github.com/SharmisthaJat/RE-DS-Word-Attention-Models/tree/master/Data/GIDS)
- **Paper:** [Improving Distantly Supervised Relation Extraction using Word and Entity Based Attention](https://arxiv.org/abs/1804.06987)
- **Size of downloaded dataset files:** 8.94 MB
- **Size of the generated dataset:** 11.82 MB
### Dataset Summary
The Google-IISc Distant Supervision (GIDS) is a new dataset for distantly-supervised relation extraction.
GIDS is seeded from the human-judged Google relation extraction corpus.
See the paper for full details: [Improving Distantly Supervised Relation Extraction using Word and Entity Based Attention](https://arxiv.org/abs/1804.06987)
Note:
- There is a formatted version that you can load with `datasets.load_dataset('gids', name='gids_formatted')`. This version is tokenized with spaCy, removes the underscores in the entities and provides entity offsets.
### Supported Tasks and Leaderboards
- **Tasks:** Relation Classification
- **Leaderboards:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
The language in the dataset is English.
## Dataset Structure
### Data Instances
#### gids
- **Size of downloaded dataset files:** 8.94 MB
- **Size of the generated dataset:** 8.5 MB
An example of 'train' looks as follows:
```json
{
"sentence": "War as appropriate. Private Alfred James_Smurthwaite Sample. 26614. 2nd Battalion Yorkshire Regiment. Son of Edward James Sample, of North_Ormesby , Yorks. Died 2 April 1917. Aged 29. Born Ormesby, Enlisted Middlesbrough. Buried BUCQUOY ROAD CEMETERY, FICHEUX. Not listed on the Middlesbrough War Memorial Private Frederick Scott. 46449. 4th Battalion Yorkshire Regiment. Son of William and Maria Scott, of 25, Aspinall St., Heywood, Lancs. Born at West Hartlepool. Died 27 May 1918. Aged 24.",
"subj_id": "/m/02qt0sv",
"obj_id": "/m/0fnhl9",
"subj_text": "James_Smurthwaite",
"obj_text": "North_Ormesby",
"relation": 4
}
```
#### gids_formatted
- **Size of downloaded dataset files:** 8.94 MB
- **Size of the generated dataset:** 11.82 MB
An example of 'train' looks as follows:
```json
{
"token": ["announced", "he", "had", "closed", "shop", ".", "Mary", "D.", "Crisp", "Coyle", "opened", "in", "1951", ".", "Stoffey", ",", "a", "Maricopa", "County", "/", "Phoenix", "city", "resident", "and", "longtime", "customer", ",", "bought", "the", "business", "in", "2011", ",", "when", "then", "owners", "were", "facing", "closure", ".", "He", "renovated", "the", "diner", "is", "interior", ",", "increased", "training", "for", "staff", "and", "expanded", "the", "menu", "."],
"subj_start": 6,
"subj_end": 9,
"obj_start": 17,
"obj_end": 22,
"relation": 4
}
```
### Data Fields
The data fields are the same among all splits.
#### gids
- `sentence`: the sentence, a `string` feature.
- `subj_id`: the id of the relation subject mention, a `string` feature.
- `obj_id`: the id of the relation object mention, a `string` feature.
- `subj_text`: the text of the relation subject mention, a `string` feature.
- `obj_text`: the text of the relation object mention, a `string` feature.
- `relation`: the relation label of this instance, an `int` classification label.
```python
{"NA": 0, "/people/person/education./education/education/institution": 1, "/people/person/education./education/education/degree": 2, "/people/person/place_of_birth": 3, "/people/deceased_person/place_of_death": 4}
```
#### gids_formatted
- `token`: the list of tokens of this sentence, obtained with spaCy, a `list` of `string` features.
- `subj_start`: the 0-based index of the start token of the relation subject mention, an `int` feature.
- `subj_end`: the 0-based index of the end token of the relation subject mention, exclusive, an `int` feature.
- `obj_start`: the 0-based index of the start token of the relation object mention, an `int` feature.
- `obj_end`: the 0-based index of the end token of the relation object mention, exclusive, an `int` feature.
- `relation`: the relation label of this instance, an `int` classification label.
```python
{"NA": 0, "/people/person/education./education/education/institution": 1, "/people/person/education./education/education/degree": 2, "/people/person/place_of_birth": 3, "/people/deceased_person/place_of_death": 4}
```
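Under these field definitions, the entity mentions can be recovered by slicing the token list with the 0-based, end-exclusive offsets; a minimal sketch using a truncated version of the `gids_formatted` train instance shown above (`mention` and `LABELS` are our own helpers, not part of the dataset):

```python
# Recover entity mention strings and the relation label name from a
# gids_formatted example, following the documented end-exclusive offsets.
LABELS = [
    "NA",
    "/people/person/education./education/education/institution",
    "/people/person/education./education/education/degree",
    "/people/person/place_of_birth",
    "/people/deceased_person/place_of_death",
]

# Truncated copy of the train example shown above (first 23 tokens).
example = {
    "token": ["announced", "he", "had", "closed", "shop", ".", "Mary", "D.",
              "Crisp", "Coyle", "opened", "in", "1951", ".", "Stoffey", ",",
              "a", "Maricopa", "County", "/", "Phoenix", "city", "resident"],
    "subj_start": 6, "subj_end": 9,
    "obj_start": 17, "obj_end": 22,
    "relation": 4,
}

def mention(ex, start_key, end_key):
    # The end offset is exclusive, per the field documentation above.
    return " ".join(ex["token"][ex[start_key]:ex[end_key]])

subj = mention(example, "subj_start", "subj_end")
obj = mention(example, "obj_start", "obj_end")
label = LABELS[example["relation"]]
print(subj, "|", obj, "|", label)
```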
### Data Splits
| | Train | Dev | Test |
|------|-------|------|------|
| GIDS | 11297 | 1864 | 5663 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{DBLP:journals/corr/abs-1804-06987,
author = {Sharmistha Jat and
Siddhesh Khandelwal and
Partha P. Talukdar},
title = {Improving Distantly Supervised Relation Extraction using Word and
Entity Based Attention},
journal = {CoRR},
volume = {abs/1804.06987},
year = {2018},
url = {http://arxiv.org/abs/1804.06987},
eprinttype = {arXiv},
eprint = {1804.06987},
timestamp = {Fri, 15 Nov 2019 17:16:02 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1804-06987.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@phucdev](https://github.com/phucdev) for adding this dataset. |
Saxo/ko_government_qa_total_linkbricks_single_dataset_with_prompt_text_huggingface_sampled | ---
license: apache-2.0
---
|
BhabhaAI/indic-instruct-data-v0.2-filtered | ---
language:
- en
- hi
multilinguality:
- multilingual
size_categories:
- 5K<n<400K
language_bcp47:
- en-US
- hi-IN
configs:
- config_name: anudesh
data_files:
- split: en
path: anudesh/en*
- split: hi
path: anudesh/hi*
- config_name: dolly
data_files:
- split: en
path: dolly/en*
- split: hi
path: dolly/hi*
- config_name: flan_v2
data_files:
- split: en
path: flan_v2/en*
- split: hi
path: flan_v2/hi*
- config_name: hh-rlhf
data_files:
- split: en
path: hh-rlhf/en*
- split: hi
path: hh-rlhf/hi*
- config_name: nmt-seed
data_files:
- split: hi
path: nmt-seed/hi*
- config_name: wikihow
data_files:
- split: en
path: wikihow/en*
- split: hi
path: wikihow/hi*
- config_name: oasst1
data_files:
- split: en
path: oasst1/en*
- split: hi
path: oasst1/hi*
- config_name: lm_sys
data_files:
- split: en
path: lm_sys/en*
- split: hi
path: lm_sys/hi*
---
This is v0.2 of [indic-instruct-data-v0.1-filtered](https://huggingface.co/datasets/BhabhaAI/indic-instruct-data-v0.1-filtered)
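The `lm_sys` config contains anonymized placeholders such as `NAME_1` and `NAME_2`; a minimal sketch of substituting them before fine-tuning (the regex and the replacement names are our own illustration, not part of the dataset):

```python
import re

# Hypothetical replacement names -- purely illustrative; choose your own.
replacements = {"NAME_1": "Alice", "NAME_2": "Bob"}

def fill_names(text: str) -> str:
    # Substitute each NAME_<k> placeholder; unmapped indices are left as-is.
    return re.sub(r"NAME_\d+",
                  lambda m: replacements.get(m.group(0), m.group(0)),
                  text)

sample = "NAME_1: hello! NAME_2: hi NAME_1, how are you? NAME_3: ..."
print(fill_names(sample))  # NAME_3 has no mapping here, so it survives unchanged
```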
**Note**: The lmsys dataset contains NAME_1, NAME_2, etc. You may replace them with actual names before fine-tuning. |
NatLee/sentiment-classification-dataset-bundle | ---
task_categories:
- text-classification
language:
- en
size_categories:
- 100K<n<1M
---
# NLP: Sentiment Classification Dataset
This is a bundle dataset for an NLP task of sentiment classification in English.
A sample project that uses this dataset is [GURA-gru-unit-for-recognizing-affect](https://github.com/NatLee/GURA-gru-unit-for-recognizing-affect).
## Content
- `myanimelist-sts`: This dataset is derived from MyAnimeList, a social networking and cataloging service for anime and manga fans. The dataset typically includes user reviews with ratings. We used [skip-thoughts](https://pypi.org/project/skip-thoughts/) to summarize them. You can find the original source of the dataset [myanimelist-comment-dataset](https://www.kaggle.com/datasets/natlee/myanimelist-comment-dataset) and the version is `2023-05-11`.
- `aclImdb`: The ACL IMDB dataset is a large movie review dataset collected for sentiment analysis tasks. It contains 50,000 highly polar movie reviews, divided evenly into 25,000 training and 25,000 test sets. Each set includes an equal number of positive and negative reviews. The source is from [sentiment](https://ai.stanford.edu/~amaas/data/sentiment/)
- `MR`: Movie Review Data (MR) is a dataset that contains 5,331 positive and 5,331 negative processed sentences/lines. This dataset is suitable for binary sentiment classification tasks, and it's a good starting point for text classification models. You can find the source [movie-review-data](http://www.cs.cornell.edu/people/pabo/movie-review-data/) and the section is `Sentiment scale datasets`.
- `MPQA`: The Multi-Perspective Question Answering (MPQA) dataset is a resource for opinion detection and sentiment analysis research. It consists of news articles from a wide variety of sources annotated for opinions and other private states. You can get the source from [MPQA](https://mpqa.cs.pitt.edu/)
- `SST2`: The Stanford Sentiment Treebank version 2 (SST2) is a popular benchmark for sentence-level sentiment analysis. It includes movie review sentences with corresponding sentiment labels (positive or negative). You can obtain the dataset from [SST2](https://huggingface.co/datasets/sst2)
- `SUBJ`: The Subjectivity dataset is used for sentiment analysis research. It consists of 5000 subjective and 5000 objective processed sentences, which can help a model to distinguish between subjective and objective (factual) statements. You can find the source [movie-review-data](http://www.cs.cornell.edu/people/pabo/movie-review-data/) and the section is `Subjectivity datasets`.
## Tokenizer
```python
from pathlib import Path
import pickle

from tensorflow.keras.preprocessing.text import Tokenizer


def check_data_path(file_path: str) -> bool:
    """Return True if the pickle exists, logging the check either way."""
    if Path(file_path).exists():
        print(f'[Path][OK] {file_path}')
        return True
    print(f'[Path][FAILED] {file_path}')
    return False


sentences = []

# =====================
# Anime Reviews (sentence pairs)
# =====================
dataset = './myanimelist-sts.pkl'
if check_data_path(dataset):
    with open(dataset, 'rb') as p:
        X, Y = pickle.load(p)
    sentences.extend(X)
    sentences.extend(Y)

# =====================
# MPQA
# =====================
dataset = './MPQA.pkl'
if check_data_path(dataset):
    with open(dataset, 'rb') as p:
        mpqa = pickle.load(p)
    sentences.extend(list(mpqa.sentence))

# =====================
# IMDB
# =====================
dataset = './aclImdb.pkl'
if check_data_path(dataset):
    with open(dataset, 'rb') as p:
        x_test, y_test, x_train, y_train = pickle.load(p)
    # y_train/y_test hold labels, so only the review texts join the corpus.
    sentences.extend(x_train)
    sentences.extend(x_test)

# =====================
# MR
# =====================
dataset = './MR.pkl'
if check_data_path(dataset):
    with open(dataset, 'rb') as p:
        mr = pickle.load(p)
    sentences.extend(list(mr.sentence))

# =====================
# SST2
# =====================
dataset = './SST2.pkl'
if check_data_path(dataset):
    with open(dataset, 'rb') as p:
        sst2 = pickle.load(p)
    sentences.extend(list(sst2.sentence))

# =====================
# SUBJ
# =====================
dataset = './SUBJ.pkl'
if check_data_path(dataset):
    with open(dataset, 'rb') as p:
        subj = pickle.load(p)
    sentences.extend(list(subj.sentence))

# The tokenizer expects strings; some sources may yield non-string entries.
sentences = list(map(str, sentences))

# Tokenize the sentences. num_words=100 keeps only the 99 most frequent
# words at lookup time; everything else maps to the {OOV} token.
my_tokenizer = Tokenizer(
    num_words=100,
    oov_token='{OOV}',
)
my_tokenizer.fit_on_texts(sentences)
print(my_tokenizer.word_index)

with open('./big-tokenizer.pkl', 'wb') as p:
    pickle.dump(my_tokenizer, p)
```
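A small self-contained demo of how `num_words` and `oov_token` interact (toy corpus, not the bundle above): only words whose index is below `num_words` keep their own IDs; the rest collapse to the OOV ID at lookup time.

```python
from tensorflow.keras.preprocessing.text import Tokenizer

# Toy corpus: with num_words=4, only index values below 4 survive lookup,
# so the least frequent word ('bad', index 4) collapses to the OOV ID (1).
tok = Tokenizer(num_words=4, oov_token='{OOV}')
tok.fit_on_texts(['good good good movie movie bad'])
print(tok.word_index)  # indices: {OOV}=1, good=2, movie=3, bad=4
print(tok.texts_to_sequences(['good bad movie']))  # [[2, 1, 3]]
```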
|
nielsr/datacomp-small-with-embeddings-and-cluster-labels | ---
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: clip_l14_embedding
sequence: float64
- name: cluster_label
dtype: int64
splits:
- name: train
num_bytes: 82751789578
num_examples: 12800000
download_size: 23194559015
dataset_size: 82751789578
---
# Dataset Card for "datacomp-small-with-embeddings-and-cluster-labels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
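The ~23 GB download can be avoided by streaming records with the `datasets` library (`load_dataset(..., streaming=True)`). The `clip_l14_embedding` column holds CLIP ViT-L/14 vectors, which are typically compared with cosine similarity; a minimal sketch using dummy stand-in vectors in place of real embeddings (the helper itself is illustrative, not part of the dataset):

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Dummy 4-d stand-ins for two rows' `clip_l14_embedding` values.
emb_a = [0.1, 0.3, 0.0, 0.2]
emb_b = [0.1, 0.3, 0.0, 0.2]
print(cosine_sim(emb_a, emb_b))  # identical vectors -> approximately 1.0
```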