| datasetId | card |
|---|---|
james-burton/imdb_genre_prediction_ordinal | ---
dataset_info:
features:
- name: Rank
dtype: int64
- name: Title
dtype: string
- name: Description
dtype: string
- name: Director
dtype: string
- name: Actors
dtype: string
- name: Year
dtype: int64
- name: Runtime (Minutes)
dtype: int64
- name: Rating
dtype: float64
- name: Votes
dtype: int64
- name: Revenue (Millions)
dtype: float64
- name: Metascore
dtype: float64
- name: Genre_is_Drama
dtype: int64
splits:
- name: train
num_bytes: 224587
num_examples: 680
- name: validation
num_bytes: 39612
num_examples: 120
- name: test
num_bytes: 65442
num_examples: 200
download_size: 0
dataset_size: 329641
---
# Dataset Card for "imdb_genre_prediction_ordinal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ConiferLM/Conifer | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 64628977
num_examples: 13606
download_size: 31032122
dataset_size: 64628977
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for Conifer
[GitHub](https://github.com/ConiferLM/Conifer) | [Paper](https://arxiv.org/abs/2404.02823)
Conifer is an open-source dataset aimed at improving the instruction-following ability of large language models (LLMs).
We recommend integrating Conifer with additional SFT datasets such as ShareGPT or Deita to enhance overall performance.
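A minimal loading sketch, assuming the `datasets` library and the `train_sft` split declared in the metadata above:
```python
from datasets import load_dataset

# Load the SFT split of Conifer; per the schema above, each example has a
# "prompt" string and a "messages" list of {"content", "role"} dicts.
dataset = load_dataset("ConiferLM/Conifer", split="train_sft")

example = dataset[0]
print(example["prompt"])
for message in example["messages"]:
    print(f'{message["role"]}: {message["content"][:80]}')
```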
## Performance
Supervised Fine-tuned (SFT) Models
| - | Final Stage | IFEval | FollowBench Avg | FollowBench Hard (L4-L5) | InFoBench | AlpacaEval LC Win Rate | MT-Bench |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| Deita-7B-v1.0-SFT | SFT | 45.1 | 42.0 | 31.6 | 78.6 | - | 7.22 |
| Evol-Instruct-7B-SFT | SFT | 44.0 | 40.7 | 27.6 | 75.6 | 9.4% | 6.51 |
| ShareGPT-7B-SFT | SFT | 43.3 | 42.9 | 32.3 | 78.5 | 11.6% | 6.86 |
| Conifer-7B-SFT |SFT | 50.8 | 44.9 | 35.7 | 79.5 | 12.5% | 7.08 |
DPO/RLHF Models
| - | Final Stage | IFEval | FollowBench Avg | FollowBench Hard (L4-L5) | InFoBench | AlpacaEval LC Win Rate | MT-Bench |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| LLaMA-2-70B-Chat | RLHF | - | 47.5 | 39.0 | 84.4 | 14.7% | 6.86 |
| Zephyr-7B-beta | DPO | 44.9 | 44.8 | 36.4 | 78.0 | 13.2% | 7.34 |
| Deita-7B-v1.0 | DPO | 51.9 | 45.7 | 38.5 | 80.9 | 16.1% | 7.55 |
| ShareGPT-7B-DPO | DPO| 48.2 | 47.7 | 38.9 | 82.0 | 15.1% | 7.10 |
| Conifer-7B-DPO |DPO| 52.3 | 50.0 | 44.1 | 82.3 | 17.1% | 7.25 |
## Citation
If you find the content of this project helpful, please cite our paper as follows:
```bibtex
@article{
coniferlm,
title={Conifer: Improving Complex Constrained Instruction-Following Ability of Large Language Models},
author={Haoran Sun and Lixin Liu and Junjie Li and Fengyu Wang and Baohua Dong and Ran Lin and Ruohui Huang},
journal={arXiv preprint arXiv:2404.02823},
year={2024},
url={https://arxiv.org/abs/2404.02823}
}
``` |
deepak-newzera/spectrogram_data_Upbeat-4s | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 105472108.125
num_examples: 3495
download_size: 104843147
dataset_size: 105472108.125
---
# Dataset Card for "spectrogram_data_Upbeat-4s"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gabriel1322/taspio | ---
license: openrail
---
|
Hack90/ncbi_genbank_part_39 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 31553866013
num_examples: 1218
download_size: 14299220624
dataset_size: 31553866013
---
# Dataset Card for "ncbi_genbank_part_39"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joaosanches/tedtalks_dataset_not_in_train | ---
dataset_info:
features:
- name: pt
dtype: string
- name: pt-br
dtype: string
splits:
- name: train
num_bytes: 39396315
num_examples: 187718
download_size: 25225794
dataset_size: 39396315
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
unpredictable/unpredictable_full | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-full
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-full" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Repository:** https://github.com/AnonCodeShare/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/unpredictable/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/unpredictable/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/unpredictable/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/unpredictable/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/unpredictable/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/unpredictable/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/unpredictable/unpredictable_support-google-com)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is also very wide: it contains thousands of tasks, each with only a few examples, whereas most current NLP datasets are deep, with tens of tasks and many examples per task. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning or pre-training on it.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary containing a 'task' field, which identifies the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target that represents an individual column of the same row. Each task contains several such examples, which can be concatenated as a few-shot task. In the case of multiple-choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
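A hypothetical example illustrating the fields described above (a sketch only; values and the exact formatting of 'input' in the released files are invented for illustration):
```python
# Hypothetical few-shot example; field names follow the description above,
# values and formatting are invented purely for illustration.
example = {
    "task": "example_table_task",              # task identifier
    "input": "[Year] 2008 [Team] Example FC",  # column elements of one table row
    "options": ["Won", "Lost"],                # candidate classes (multiple-choice tasks)
    "output": "Won",                           # target column element of the same row
    "pageTitle": "Example page title",
    "title": "Example table title",
    "outputColName": "Result",
    "url": "https://example.com/some-table",
    "wdcFile": "example-wdc-file",
}
```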
### Data Fields
'task': task identifier
'input': column elements of a specific row in the table.
'options': for multiple choice classification, it provides the options to choose from.
'output': target column element of the same row as input.
'pageTitle': the title of the page containing the table.
'outputColName': output column name
'url': url to the website containing the table
'wdcFile': WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Licensing Information
Apache 2.0 |
Mike36Theone/GaiofatoFinal | ---
license: openrail
---
|
shreevigneshs/iwslt-2023-en-ko-train-val-split-0.1 | ---
dataset_info:
features:
- name: en
dtype: string
- name: ko
dtype: string
- name: ko_annotated
dtype: string
- name: styles
dtype: int64
splits:
- name: train
num_bytes: 283232.0
num_examples: 720
- name: val
num_bytes: 32220.0
num_examples: 80
- name: if_test
num_bytes: 238485.0
num_examples: 597
- name: f_test
num_bytes: 249702.0
num_examples: 597
- name: f_flores
num_bytes: 312159
num_examples: 1012
- name: if_flores
num_bytes: 312159
num_examples: 1012
download_size: 702238
dataset_size: 1427957.0
---
# Dataset Card for "iwslt-2023-en-ko-train-val-split-0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_22 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20243076480.0
num_examples: 210760
download_size: 17915722749
dataset_size: 20243076480.0
---
# Dataset Card for "chunk_22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ofutun/dependencies | ---
license: unknown
---
|
Valmy/Hackers_Face_Detection_Image | ---
license: other
---
|
zolak/twitter_dataset_79_1713096783 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2799579
num_examples: 6968
download_size: 1407366
dataset_size: 2799579
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vikp/codem_filtered | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: kind
dtype: string
- name: quality_prob
dtype: float64
- name: learning_prob
dtype: float64
splits:
- name: train
num_bytes: 49267861.09607679
num_examples: 31046
download_size: 21584553
dataset_size: 49267861.09607679
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "codem_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vsrirama/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1424125.0
num_examples: 39
- name: validation
num_bytes: 577591.0
num_examples: 16
download_size: 0
dataset_size: 2001716.0
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unigram/fol-03b | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: proof
dtype: string
- name: premise_tptp
dtype: string
- name: hypothesis_tptp
dtype: string
- name: deberta_pred
dtype: string
- name: deberta_pred_r1_label
dtype: string
- name: deberta_pred_r2_label
dtype: string
- name: deberta_pred_r3_label
dtype: string
splits:
- name: train
num_bytes: 11318974
num_examples: 1506
- name: validation
num_bytes: 1847876
num_examples: 255
- name: test
num_bytes: 1772318
num_examples: 228
download_size: 2618364
dataset_size: 14939168
---
# Dataset Card for "fol-03b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AntoineBlanot/coqa-questions-answers | ---
dataset_info:
features:
- name: text
dtype: string
- name: label_name
dtype: string
splits:
- name: train
num_bytes: 9184688
num_examples: 217294
- name: validation
num_bytes: 665654
num_examples: 15966
download_size: 4173007
dataset_size: 9850342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
kothasuhas/QuRatedPajama_c4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: writing_style_average
dtype: float64
- name: facts_and_trivia_average
dtype: float64
- name: educational_value_average
dtype: float64
- name: required_expertise_average
dtype: float64
- name: writing_style_chunks
sequence: float64
- name: facts_and_trivia_chunks
sequence: float64
- name: educational_value_chunks
sequence: float64
- name: required_expertise_chunks
sequence: float64
- name: length
dtype: int64
- name: chunk_lengths
sequence: int64
- name: input_ids
sequence: int32
- name: document_index
dtype: int64
- name: document_position
dtype: int64
- name: source_domain
dtype: string
- name: cluster_id
dtype: int64
- name: cluster_no
dtype: int64
splits:
- name: train
num_bytes: 1331409138
num_examples: 159673
download_size: 707377848
dataset_size: 1331409138
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a subset of the [QuRate dataset](https://huggingface.co/datasets/princeton-nlp/QuRatedPajama-1B_tokens_for_analysis) filtered for C4 data only, used in a quick data filtering/curriculum challenge. |
Sunbird/m2e_6_4_padded_no_tags_no_augs | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4401362036
num_examples: 2626111
- name: valid
num_bytes: 4190000
num_examples: 2500
download_size: 384950492
dataset_size: 4405552036
---
# Dataset Card for "m2e_6_4_padded_no_tags_no_augs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/ENN_masking_embeddings_dim_2 | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1345440
num_examples: 67272
download_size: 750654
dataset_size: 1345440
---
# Dataset Card for "ENN_masking_embeddings_dim_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hari560/Mistral_AI_Medical_Dataset | ---
task_categories:
- text-generation
language:
- en
tags:
- medical
--- |
ThWu/filtered_nectar | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: model
dtype: string
- name: rank
dtype: float64
- name: turns
dtype: int64
- name: num_responses
dtype: int64
- name: source
sequence: string
- name: good_natured
dtype: bool
splits:
- name: train
num_bytes: 1203987935.0543852
num_examples: 182470
download_size: 519016885
dataset_size: 1203987935.0543852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "filtered_nectar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/variety-logic-training | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
- name: text
dtype: string
- name: question
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 117622037
num_examples: 110214
download_size: 24688336
dataset_size: 117622037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
language:
- en
pretty_name: A Variety of Combined Logic Data
---
This is a concatenation of several other datasets, mostly converted to Alpaca-style prompts, intended to provide a good logic dataset for settling or fine-tuning models. |
taskydata/realtasky | ---
language:
- en
---
|Dataset|Bytes|Samples|Capping|
|-------|-----|-------|-------|
|[Unnatural Instructions](https://huggingface.co/datasets/mrm8488/unnatural-instructions-full) | 27M | 66010 | / |
|[Big-Bench](https://huggingface.co/datasets/bigbench) | 1.7G | 2631238| / |
|[FLAN](https://huggingface.co/datasets/Muennighoff/flan) | 3.1G | 3354260 | [30K examples per dataset max with 10 templates total (So 3K / template)](https://github.com/Muennighoff/FLAN/blob/main/flan/tasks.py) |
|[SuperNatural-Instructions](https://huggingface.co/datasets/Muennighoff/natural-instructions) | 7.4G | 7101558 | / |
|[StackOverflow](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_titlebody_best_voted_answer_jsonl) | 9.0G | 4730542 | / |
|[xP3-EN](https://huggingface.co/datasets/bigscience/xP3) | 37G | 31495184 | [100K examples per data subset per prompt allowed (So 100K / template)](https://github.com/bigscience-workshop/bigscience/blob/e848657707a549dda35c8b3cc63a96d2064b2983/data/xp3/prepare_xp3_train.py#L15) |
|Total|58G|49378792| |
|
LahiruLowe/cot_explanation_targets_vilsonrodrigues_falcon7b_instructsharded | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 34217
num_examples: 36
download_size: 16913
dataset_size: 34217
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cot_explanation_targets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jdnvn/menu-items-allmenus | ---
license: apache-2.0
---
|
PY007/tokenized_slim6B_train_neox_4096 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 22456410848
num_examples: 1370296
download_size: 9712660598
dataset_size: 22456410848
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seongill/NQ_5_missing_adv | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: has_answer
dtype: bool
- name: similar_sub
dtype: string
- name: ctxs
list:
- name: answer_sent
sequence: string
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: is_adv
dtype: bool
- name: new_answer_sent
dtype: string
- name: original_text
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: status
dtype: string
splits:
- name: train
num_bytes: 14863743
num_examples: 3610
download_size: 8082600
dataset_size: 14863743
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vidhikatkoria/DA_Restaurants | ---
dataset_info:
features:
- name: domain
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: act
dtype: int64
- name: speaker
dtype: int64
- name: generated
dtype: string
splits:
- name: train
num_bytes: 1064689
num_examples: 3588
download_size: 452653
dataset_size: 1064689
---
# Dataset Card for "DA_Restaurants"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__guess-en_3-fcaae9-2012466612 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: facebook/opt-13b
metrics: []
dataset_name: futin/guess
dataset_config: en_3
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-13b
* Dataset: futin/guess
* Config: en_3
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
shahules786/PoetryFoundationData | ---
dataset_info:
features:
- name: poem name
dtype: string
- name: content
dtype: string
- name: author
dtype: string
- name: type
dtype: string
- name: age
dtype: 'null'
splits:
- name: train
num_bytes: 23187576
num_examples: 13854
download_size: 14466446
dataset_size: 23187576
---
This dataset contains nearly all poems from the [Poetry Foundation Website](https://www.poetryfoundation.org/).
## Content
All poems have a title and author. Most poems are also labeled with the tags available from the Poetry Foundation Website.
## Inspiration
This dataset can be used for a variety of tasks related to poetry writing.
|
notrichardren/truthfulness_legacy | ---
license: apache-2.0
dataset_info:
features:
- name: claim
dtype: string
- name: label
dtype: int64
- name: explanation
dtype: string
- name: common_knowledge_label
dtype: float64
- name: origin_dataset
dtype: string
splits:
- name: train
num_bytes: 28377892
num_examples: 210326
download_size: 12100978
dataset_size: 28377892
---
|
declare-lab/TangoPromptBank | ---
license: mit
size_categories:
- 1M<n<10M
---
# Project Links
[Github](https://github.com/declare-lab/tango)
[Web](https://tango-web.github.io/)
[Huggingface Space](https://huggingface.co/spaces/declare-lab/tango)
# Dataset Description
This dataset was used to pre-train [Tango-Full-FT-Audiocaps](https://huggingface.co/declare-lab/tango-full-ft-audiocaps). **TangoPromptBank** is a diverse corpus of textual prompts and audio samples sourced from the WavCaps [1], AudioCaps [9], ESC [2], UrbanSound [3], MusicCaps [4], GTZAN [5], and Musical Instruments [6] datasets. The dataset statistics are reported in the Dataset Statistics section below. All audio clips longer than 10 seconds were segmented into successive partitions of 10 seconds or shorter, and all audio clips were resampled to 16 kHz.
The WavCaps dataset consists of ChatGPT-generated captions for the FreeSound [7], BBC Sound Effects [8] (SFX), and the AudioSet strongly labeled subset. The Urban Sound and ESC50 datasets contain various environmental sounds. The Musical Instruments dataset contains sounds of guitar, drum, violin, and piano instruments. The GTZAN dataset contains sounds of different musical genres -- classical, jazz, etc. These four datasets -- Urban Sound, ESC50, Musical Instruments, and GTZAN -- are audio classification datasets. We use the classification label (e.g., *piano*) and a more natural prompt (*sound of piano*) to create two different training instances for each audio sample from these datasets.
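A rough sketch of the preprocessing described above (10-second segmentation and 16 kHz resampling), using torchaudio as an assumed tool rather than the authors' actual pipeline:
```python
import torchaudio
import torchaudio.functional as F

TARGET_SR = 16_000    # 16 kHz, as stated above
SEGMENT_SECONDS = 10  # clips longer than 10 s are split into 10 s (or shorter) chunks

def preprocess(path: str):
    """Resample one audio file to 16 kHz and split it into <= 10 s segments.
    A sketch of the described preprocessing, not the authors' exact code."""
    waveform, sr = torchaudio.load(path)
    if sr != TARGET_SR:
        waveform = F.resample(waveform, orig_freq=sr, new_freq=TARGET_SR)
    chunk = SEGMENT_SECONDS * TARGET_SR
    # Successive 10-second partitions; the last one may be shorter.
    return [waveform[:, i:i + chunk] for i in range(0, waveform.shape[1], chunk)]
```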
[1]: [WavCaps](https://arxiv.org/abs/2303.17395) [2]: [ESC](http://dl.acm.org/citation.cfm?doid=2733373.2806390)
[3]: [UrbanSound](https://dl.acm.org/doi/10.1145/2647868.2655045)
[4]: [MusicCaps](https://arxiv.org/abs/2301.11325)
[5]: [GTZAN](https://ieeexplore.ieee.org/document/1021072)
[6]: [Musical Instruments Dataset](https://www.kaggle.com/datasets/soumendraprasad/musical-instruments-sound-dataset)
[7]: [FreeSound](https://freesound.org/)
[8]: [BBC Sound Effects](https://sound-effects.bbcrewind.co.uk) [9]: [AudioCaps](https://aclanthology.org/N19-1011/)
# Dataset Statistics
| Dataset | Count |
|-------------------------|-------|
| AudioSet Strong | 108K |
| AudioCaps | 45K |
| Freesound | 680K |
| BBC | 374K |
| Urban Sound | 17K |
| Musical Instrument | 10K |
| MusicCaps | 10K |
| Gtzan Music Genre | 6K |
| ESC50 | 4K |
| **Total** | **1.2M** |
# Baseline Results using TangoPromptBank for Pre-training
| **Model** | **Datasets** | **Dataset Size** | **#Params** | **FD ↓** | **KL ↓** |
| --- | --- | --- | --- | --- | --- |
| [**Tango-Full-FT-Audiocaps**](https://huggingface.co/declare-lab/tango-full-ft-audiocaps) | AS+AC+7 others | 1.2M | 866M | **18.93** | **1.12** |
# Citation
Please consider citing the following article if you found our work useful:
```bibtex
@article{ghosal2023tango,
title={Text-to-Audio Generation using Instruction Tuned LLM and Latent Diffusion Model},
author={Ghosal, Deepanway and Majumder, Navonil and Mehrish, Ambuj and Poria, Soujanya},
journal={arXiv preprint arXiv:2304.13731},
year={2023}
}
``` |
indonlp/nusaparagraph_topic | ---
license: apache-2.0
---
|
ahadda5/sanad | ---
license: apache-2.0
---
|
thechaingamer/ada-git-code | ---
license: mit
---
|
aneeshas/imsdb-genre-movie-scripts | ---
dataset_info:
features:
- name: Action
dtype: string
- name: Horror
dtype: string
- name: Sci-Fi
dtype: string
- name: Comedy
dtype: string
- name: Drama
dtype: string
splits:
- name: train
num_bytes: 180531797
num_examples: 150
download_size: 80225374
dataset_size: 180531797
---
# Dataset Card for "imsdb-genre-movie-scripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
charlesmichaelvaughn/charlesmichaelvaughn | ---
license: apache-2.0
---
|
ovior/twitter_dataset_1713016490 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2357018
num_examples: 7145
download_size: 1342754
dataset_size: 2357018
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Francesco/paper-parts | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': paper-parts
'1': author
'2': chapter
'3': equation
'4': equation number
'5': figure
'6': figure caption
'7': footnote
'8': list of content heading
'9': list of content text
'10': page number
'11': paragraph
'12': reference text
'13': section
'14': subsection
'15': subsubsection
'16': table
'17': table caption
'18': table of contents text
'19': title
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: paper-parts
tags:
- rf100
---
# Dataset Card for paper-parts
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/paper-parts
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
paper-parts
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files can take a significant amount of time, so it is important to query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]` (see the sketch after this list)
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
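A minimal sketch of the recommended access pattern, assuming this repository loads directly with `load_dataset`:
```python
from datasets import load_dataset

dataset = load_dataset("Francesco/paper-parts", split="train")

# Efficient: index the sample first, so only that one image file is decoded.
image = dataset[0]["image"]

# Inefficient: accessing the column first decodes every image in the split.
# images = dataset["image"]  # avoid this for large splits
```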
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/paper-parts
### Citation Information
```
@misc{ paper-parts,
title = { paper parts Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/paper-parts } },
url = { https://universe.roboflow.com/object-detection/paper-parts },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}"
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
Sushi123/EdexcelBiologyGCSE | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: __index_level_0__
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 549642
num_examples: 845
download_size: 292686
dataset_size: 549642
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Chara-Ann/Dazai_h0 | ---
license: artistic-2.0
---
|
open-llm-leaderboard/details_chatty123__mistral_rank16_dpo | ---
pretty_name: Evaluation run of chatty123/mistral_rank16_dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chatty123/mistral_rank16_dpo](https://huggingface.co/chatty123/mistral_rank16_dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chatty123__mistral_rank16_dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T18:37:13.102672](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank16_dpo/blob/main/results_2024-04-15T18-37-13.102672.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6030612676628558,\n\
\ \"acc_stderr\": 0.03332071454311037,\n \"acc_norm\": 0.6076501456612151,\n\
\ \"acc_norm_stderr\": 0.033996312981612854,\n \"mc1\": 0.5238678090575275,\n\
\ \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6829506332175648,\n\
\ \"mc2_stderr\": 0.015252914140641184\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403084,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491885\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n\
\ \"acc_stderr\": 0.004714386376337135,\n \"acc_norm\": 0.8497311292571201,\n\
\ \"acc_norm_stderr\": 0.0035660447773274207\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6806451612903226,\n \"acc_stderr\": 0.026522709674667768,\n \"\
acc_norm\": 0.6806451612903226,\n \"acc_norm_stderr\": 0.026522709674667768\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646826,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646826\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077785,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.0253052581318797,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.0253052581318797\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313161,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313161\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.01970687580408564,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.01970687580408564\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n\
\ \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6829506332175648,\n\
\ \"mc2_stderr\": 0.015252914140641184\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836675\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3995451099317665,\n \
\ \"acc_stderr\": 0.013491660298815995\n }\n}\n```"
repo_url: https://huggingface.co/chatty123/mistral_rank16_dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|arc:challenge|25_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|gsm8k|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hellaswag|10_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-37-13.102672.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T18-37-13.102672.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- '**/details_harness|winogrande|5_2024-04-15T18-37-13.102672.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T18-37-13.102672.parquet'
- config_name: results
data_files:
- split: 2024_04_15T18_37_13.102672
path:
- results_2024-04-15T18-37-13.102672.parquet
- split: latest
path:
- results_2024-04-15T18-37-13.102672.parquet
---
# Dataset Card for Evaluation run of chatty123/mistral_rank16_dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chatty123/mistral_rank16_dpo](https://huggingface.co/chatty123/mistral_rank16_dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chatty123__mistral_rank16_dpo",
"harness_winogrande_5",
split="train")
```
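Similarly, the aggregated metrics mentioned above can be pulled from the "results" configuration; a minimal sketch (the exact columns depend on the run) would be:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_chatty123__mistral_rank16_dpo",
    "results",
    split="latest",
)
print(results[0])
```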
## Latest results
These are the [latest results from run 2024-04-15T18:37:13.102672](https://huggingface.co/datasets/open-llm-leaderboard/details_chatty123__mistral_rank16_dpo/blob/main/results_2024-04-15T18-37-13.102672.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6030612676628558,
"acc_stderr": 0.03332071454311037,
"acc_norm": 0.6076501456612151,
"acc_norm_stderr": 0.033996312981612854,
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6829506332175648,
"mc2_stderr": 0.015252914140641184
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403084,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491885
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337135,
"acc_norm": 0.8497311292571201,
"acc_norm_stderr": 0.0035660447773274207
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667768,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667768
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646826,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646826
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077785,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.0253052581318797,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.0253052581318797
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313161,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313161
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.01970687580408564,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.01970687580408564
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6829506332175648,
"mc2_stderr": 0.015252914140641184
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836675
},
"harness|gsm8k|5": {
"acc": 0.3995451099317665,
"acc_stderr": 0.013491660298815995
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Leon-LLM/Leon-Chess-Dataset-1M-BOS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 554459020
num_examples: 1028170
download_size: 282676393
dataset_size: 554459020
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-1M-BOS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/south_dakota_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of south_dakota/サウスダコタ/南达科他 (Azur Lane)
This is the dataset of south_dakota/サウスダコタ/南达科他 (Azur Lane), containing 224 images and their tags.
The core tags of this character are `long_hair, breasts, dark_skin, dark-skinned_female, black_hair, large_breasts, braid, hair_between_eyes, brown_eyes, hair_ornament, yellow_eyes, feather_hair_ornament, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 224 | 287.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/south_dakota_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 224 | 166.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/south_dakota_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 537 | 340.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/south_dakota_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 224 | 256.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/south_dakota_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 537 | 470.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/south_dakota_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
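Any of these packages can also be fetched programmatically; a minimal sketch using `hf_hub_download` (the filename `dataset-800.zip` is taken from the table above) might look like:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/south_dakota_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the image/text pairs into a local directory
extract_dir = 'dataset_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```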
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/south_dakota_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, feathers, native_american, solo, cleavage, bare_shoulders, looking_at_viewer, blush, crop_top, necklace, closed_mouth, upper_body, collarbone, navel, simple_background, areola_slip |
| 1 | 35 |  |  |  |  |  | 1girl, feathers, native_american, crop_top, bare_shoulders, necklace, solo, short_shorts, cleavage, looking_at_viewer, thighhighs, navel, midriff, black_shorts, bracelet, blush, machinery, simple_background |
| 2 | 22 |  |  |  |  |  | 1girl, bare_shoulders, earrings, solo, white_dress, armlet, evening_gown, looking_at_viewer, blush, brown_hair, backless_dress, cleavage, official_alternate_costume, smile, ass, simple_background |
| 3 | 5 |  |  |  |  |  | 1girl, backless_dress, bare_shoulders, looking_at_viewer, looking_back, sitting, white_dress, armlet, ass, feather_earrings, from_behind, grand_piano, sideboob, solo, blush, high_heels, sheet_music, black_cat, full_body, halterneck, official_alternate_costume, simple_background, white_background |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, pussy, single_braid, smile, dark_nipples, navel, open_mouth, penis, sex, spread_legs, vaginal, bar_censor, completely_nude, dress, feathers, jewelry, mosaic_censoring, native_american, on_side, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | feathers | native_american | solo | cleavage | bare_shoulders | looking_at_viewer | blush | crop_top | necklace | closed_mouth | upper_body | collarbone | navel | simple_background | areola_slip | short_shorts | thighhighs | midriff | black_shorts | bracelet | machinery | earrings | white_dress | armlet | evening_gown | brown_hair | backless_dress | official_alternate_costume | smile | ass | looking_back | sitting | feather_earrings | from_behind | grand_piano | sideboob | high_heels | sheet_music | black_cat | full_body | halterneck | white_background | 1boy | hetero | solo_focus | pussy | single_braid | dark_nipples | open_mouth | penis | sex | spread_legs | vaginal | bar_censor | completely_nude | dress | jewelry | mosaic_censoring | on_side |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:------------------|:-------|:-----------|:-----------------|:--------------------|:--------|:-----------|:-----------|:---------------|:-------------|:-------------|:--------|:--------------------|:--------------|:---------------|:-------------|:----------|:---------------|:-----------|:------------|:-----------|:--------------|:---------|:---------------|:-------------|:-----------------|:-----------------------------|:--------|:------|:---------------|:----------|:-------------------|:--------------|:--------------|:-----------|:-------------|:--------------|:------------|:------------|:-------------|:-------------------|:-------|:---------|:-------------|:--------|:---------------|:---------------|:-------------|:--------|:------|:--------------|:----------|:-------------|:------------------|:--------|:----------|:-------------------|:----------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 22 |  |  |  |  |  | X | | | X | X | X | X | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | X | X | X | | | | | | | X | | | | | | | | | X | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
hmao/reformatted_singleapi_openai | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: api_name
dtype: string
- name: api_definition
dtype: string
- name: dataset_name
dtype: string
splits:
- name: train
num_bytes: 21189
num_examples: 14
download_size: 14297
dataset_size: 21189
---
# Dataset Card for "reformatted_singleapi_openai"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/league_faces_captioned_priors_fast | ---
dataset_info:
features:
- name: splash
dtype: image
- name: tile
dtype: image
- name: label
dtype: string
- name: caption
dtype: string
- name: PRIOR_0
dtype: image
- name: PRIOR_1
dtype: image
- name: PRIOR_2
dtype: image
- name: PRIOR_3
dtype: image
- name: PRIOR_4
dtype: image
splits:
- name: train
num_bytes: 110849850.0
num_examples: 50
download_size: 110857003
dataset_size: 110849850.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/find_second_sent_train_30_eval_10_hint5 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 90400
num_examples: 70
- name: validation
num_bytes: 11329
num_examples: 10
download_size: 64865
dataset_size: 101729
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_second_sent_train_30_eval_10_hint5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marcus2000/legal_dataset2023 | ---
dataset_info:
features:
- name: '0'
dtype: string
- name: '1'
dtype: string
splits:
- name: train
num_bytes: 110824374
num_examples: 1723
- name: test
num_bytes: 21065187
num_examples: 306
download_size: 41312472
dataset_size: 131889561
---
# Dataset Card for "legal_dataset2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_automerger__ShadowYam-7B | ---
pretty_name: Evaluation run of automerger/ShadowYam-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/ShadowYam-7B](https://huggingface.co/automerger/ShadowYam-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__ShadowYam-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T04:40:02.904834](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__ShadowYam-7B/blob/main/results_2024-03-11T04-40-02.904834.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512440185687127,\n\
\ \"acc_stderr\": 0.03206829304384349,\n \"acc_norm\": 0.6505187714846713,\n\
\ \"acc_norm_stderr\": 0.03274069891411406,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7804977896814992,\n\
\ \"mc2_stderr\": 0.01369433917187934\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653884,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136438\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.713802031467835,\n\
\ \"acc_stderr\": 0.004510593395289895,\n \"acc_norm\": 0.8906592312288388,\n\
\ \"acc_norm_stderr\": 0.0031142850772280365\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903343,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903343\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657474,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657474\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806318,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7804977896814992,\n\
\ \"mc2_stderr\": 0.01369433917187934\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.0100992082460656\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \
\ \"acc_stderr\": 0.012731710925078132\n }\n}\n```"
repo_url: https://huggingface.co/automerger/ShadowYam-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|arc:challenge|25_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|gsm8k|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hellaswag|10_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-40-02.904834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T04-40-02.904834.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- '**/details_harness|winogrande|5_2024-03-11T04-40-02.904834.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T04-40-02.904834.parquet'
- config_name: results
data_files:
- split: 2024_03_11T04_40_02.904834
path:
- results_2024-03-11T04-40-02.904834.parquet
- split: latest
path:
- results_2024-03-11T04-40-02.904834.parquet
---
# Dataset Card for Evaluation run of automerger/ShadowYam-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/ShadowYam-7B](https://huggingface.co/automerger/ShadowYam-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__ShadowYam-7B",
"harness_winogrande_5",
split="train")
```
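The same pattern works for every configuration listed in the YAML header above. The sketch below is illustrative only: it loads the "latest" split of one task and the aggregated "results" configuration, but the exact column layout of each parquet export is not guaranteed, so treat the printed output as an example rather than a schema reference.
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_automerger__ShadowYam-7B"

# Per-task details: any config name from the list above, "latest" split.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)

# Aggregated metrics: the "results" config stores the run-level summary.
results = load_dataset(repo, "results", split="latest")
print(results[0])  # one row per run; column layout depends on the export
```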
## Latest results
These are the [latest results from run 2024-03-11T04:40:02.904834](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__ShadowYam-7B/blob/main/results_2024-03-11T04-40-02.904834.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6512440185687127,
"acc_stderr": 0.03206829304384349,
"acc_norm": 0.6505187714846713,
"acc_norm_stderr": 0.03274069891411406,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7804977896814992,
"mc2_stderr": 0.01369433917187934
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653884,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136438
},
"harness|hellaswag|10": {
"acc": 0.713802031467835,
"acc_stderr": 0.004510593395289895,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903343,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903343
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268584,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657474,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657474
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806318,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7804977896814992,
"mc2_stderr": 0.01369433917187934
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.0100992082460656
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078132
}
}
```
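As a small worked example of post-processing the raw numbers above, the sketch below averages the per-subject MMLU (`hendrycksTest`) accuracies. It assumes the JSON block has been saved locally as `results.json` (a hypothetical filename used only for illustration).
```python
import json

with open("results.json") as f:  # hypothetical local copy of the JSON block above
    results = json.load(f)

# Collect the accuracy of every MMLU sub-task and average them.
mmlu_acc = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mean_acc = sum(mmlu_acc.values()) / len(mmlu_acc)
print(f"{len(mmlu_acc)} MMLU sub-tasks, mean acc = {mean_acc:.4f}")
```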
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yzhuang/metatree_cpu_act | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1075736
num_examples: 5722
- name: validation
num_bytes: 464360
num_examples: 2470
download_size: 888030
dataset_size: 1540096
---
# Dataset Card for "metatree_cpu_act"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fakermiya/10k-sfw-nsfw | ---
license: gpl-3.0
---
|
julien-c/autotrain-dreambooth-marsupilami-data | ---
license: openrail
task_categories:
- image-to-image
tags:
- marsupilami
- not-for-all-eyes
size_categories:
- n<1K
---
Dataset of a few Marsupilami pictures
PS/ I used git+ssh to push this commit to the Hub 🔥
Thank you @XCiD and @sbrandeis |
pvduy/evol_70k_with_output_Xwin | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 165817770
num_examples: 70000
download_size: 79750128
dataset_size: 165817770
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "evol_70k_with_output_Xwin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zouharvi/bio-mqm-dataset | ---
license: apache-2.0
language:
- en
- de
- es
- eu
- fr
- it
- pt
- ru
- zh
task_categories:
- translation
pretty_name: Biomedical MQM Dataset
size_categories:
- 10K<n<100K
tags:
- mqm
- quality
- bio
- medical
---
This dataset is compiled from the official [Amazon repository](https://github.com/amazon-science/bio-mqm-dataset) (all respective licensing applies) and accompanies the paper [Fine-Tuned Machine Translation Metrics Struggle in Unseen Domains](https://arxiv.org/abs/2402.18747).
It contains system translations, multiple references, and their quality evaluation on the MQM scale.
> **Abstract:** We introduce a new, extensive multidimensional quality metrics (MQM) annotated dataset covering 11 language pairs in the biomedical domain. We use this dataset to investigate whether machine translation (MT) metrics which are fine-tuned on human-generated MT quality judgements are robust to domain shifts between training and inference. We find that fine-tuned metrics exhibit a substantial performance drop in the unseen domain scenario relative to metrics that rely on the surface form, as well as pre-trained metrics which are not fine-tuned on MT quality judgments.
Example segment:
```
{
"src": "From 2004 to 03/2020, overall 449 pats. underwent EUS-guided cholangiodrainage (n = 37 pats. with cholangiolithiasis).",
"tgt": "Von 2004 bis 03/2020 wurden insgesamt 449 Pat. einer EUS-gesteuerten Cholangiodrainage unterzogen (n = 37 Pat. mit Cholangiolithiasis).",
"ref": [
"Von 2004 bis 03/2020 wurden insgesamt 449 Pat. einer EUS-gestützten Gallenwegdrainage unterzogen (n = 37 Pat. mit Cholangiolithiasis).",
"Von 2004 bis 03/2020 wurden insgesamt 449 Pat. einer EUS-gestützten Gallenwegdrainage unterzogen (n = 37 Pat. mit Cholangiolithiasis)."
],
"system": "HuaweiTSC_run1",
"lang_src": "en", "lang_tgt": "de",
"annotator": "RH1/ende",
"errors_src": [],
"errors_tgt": [
{"term": "03/2020", "startIndex": 13, "endIndex": 19, "error_category": "Locale_conventions", "error_subcategory": "Date_format", "severity": "Minor"},
{"term": "Cholangiolithiasis", "startIndex": 115, "endIndex": 132, "error_category": "Accuracy", "error_subcategory": "Mistranslation", "severity": "Minor"}
],
"doc_id": "doc42"
}
```
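For orientation, here is a minimal sketch of loading these segments with the `datasets` library and filtering them to one language pair; the split name `train` and the exact field layout are assumptions based on the example above, so check the repository viewer before relying on them.
```python
from datasets import load_dataset

# Split name "train" is an assumption; the fields below mirror the example segment.
ds = load_dataset("zouharvi/bio-mqm-dataset", split="train")

# Keep only English -> German segments and count target-side MQM errors.
en_de = ds.filter(lambda ex: ex["lang_src"] == "en" and ex["lang_tgt"] == "de")
n_errors = sum(len(ex["errors_tgt"]) for ex in en_de)
print(f"{len(en_de)} en->de segments with {n_errors} target-side errors")
```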
If you use this dataset, please cite [the paper](https://arxiv.org/abs/2402.18747).
```
@misc{zouhar2024finetuned,
title={Fine-Tuned Machine Translation Metrics Struggle in Unseen Domains},
author={Vilém Zouhar and Shuoyang Ding and Anna Currey and Tatyana Badeka and Jenyuan Wang and Brian Thompson},
year={2024},
eprint={2402.18747},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
Nexdata/1200_Videos_Potholed_Road_Collection_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
1,200 Videos – Potholed Road Collection Data. The videos last between 7 and 15 seconds. The collection device is a 360 automobile data recorder, and the video resolution is 2,560*1,440. The data diversity includes different potholed roads and multiple scenes. The collection time is daytime. The data can be used for tasks such as potholed road detection and recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1317?source=Huggingface
## Data size
1,200 videos; each video lasts between 7 and 15 seconds
## Collecting environment
potholed road
## Data diversity
including different potholed roads and multiple scenes
## Device
360 automobile data recorder; the video resolution is 2,560*1,440
## Photographic angle
eye-level angle
## Collecting time
day
## Data format
the video data format is .mp4
## Annotation content
potholed road data were collected under different road scenarios
## Accuracy rate
based on the collection content, the collection accuracy is over 97%
## Licensing Information
Commercial License
|
sanchit-gandhi/concatenated-train-set | ---
dataset_info:
config_name: train
features:
- name: id
dtype: string
- name: text
dtype: string
- name: input_features
dtype: image
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 3567367447670.0
num_examples: 2320189
download_size: 2142675924205
dataset_size: 3567367447670.0
configs:
- config_name: train
data_files:
- split: train
path: train/train-*
---
|
yzhuang/metatree_BNG_heart_statlog_ | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 86832736
num_examples: 700264
- name: validation
num_bytes: 37167264
num_examples: 299736
download_size: 65505966
dataset_size: 124000000
---
# Dataset Card for "metatree_BNG_heart_statlog_"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft | ---
pretty_name: Evaluation run of h2oai/h2o-danube-1.8b-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2o-danube-1.8b-sft](https://huggingface.co/h2oai/h2o-danube-1.8b-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T22:54:49.142615](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft/blob/main/results_2024-02-01T22-54-49.142615.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3428374590678907,\n\
\ \"acc_stderr\": 0.033339599143861524,\n \"acc_norm\": 0.34426324885865267,\n\
\ \"acc_norm_stderr\": 0.03407406752430132,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826854,\n \"mc2\": 0.4028619731190418,\n\
\ \"mc2_stderr\": 0.01428278746898766\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.37372013651877134,\n \"acc_stderr\": 0.014137708601759098,\n\
\ \"acc_norm\": 0.40187713310580203,\n \"acc_norm_stderr\": 0.01432726861457827\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49790878311093406,\n\
\ \"acc_stderr\": 0.004989737768749943,\n \"acc_norm\": 0.6733718382792272,\n\
\ \"acc_norm_stderr\": 0.004680215003395913\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361061,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361061\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3886792452830189,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.3886792452830189,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416545,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416545\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234092,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234092\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.027869320571664635,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.027869320571664635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617732,\n\
\ \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617732\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.038783721137112745,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.038783721137112745\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.03555300319557673,\n\
\ \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.03555300319557673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772735,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772735\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372153,\n\
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372153\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.344954128440367,\n \"acc_stderr\": 0.020380605405066962,\n \"\
acc_norm\": 0.344954128440367,\n \"acc_norm_stderr\": 0.020380605405066962\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533484,\n \
\ \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533484\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258975,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258975\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3717948717948718,\n\
\ \"acc_stderr\": 0.031660988918880785,\n \"acc_norm\": 0.3717948717948718,\n\
\ \"acc_norm_stderr\": 0.031660988918880785\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4648786717752235,\n\
\ \"acc_stderr\": 0.01783579880629064,\n \"acc_norm\": 0.4648786717752235,\n\
\ \"acc_norm_stderr\": 0.01783579880629064\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3790849673202614,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.3790849673202614,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3549382716049383,\n \"acc_stderr\": 0.026624152478845853,\n\
\ \"acc_norm\": 0.3549382716049383,\n \"acc_norm_stderr\": 0.026624152478845853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460997,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460997\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2816166883963494,\n\
\ \"acc_stderr\": 0.011487783272786696,\n \"acc_norm\": 0.2816166883963494,\n\
\ \"acc_norm_stderr\": 0.011487783272786696\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n\
\ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.3034825870646766,\n\
\ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529917,\n\
\ \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529917\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826854,\n \"mc2\": 0.4028619731190418,\n\
\ \"mc2_stderr\": 0.01428278746898766\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.01336659695193438\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1508718726307809,\n \
\ \"acc_stderr\": 0.009859004137305689\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2o-danube-1.8b-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|arc:challenge|25_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|gsm8k|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hellaswag|10_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T22-54-49.142615.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- '**/details_harness|winogrande|5_2024-02-01T22-54-49.142615.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T22-54-49.142615.parquet'
- config_name: results
data_files:
- split: 2024_02_01T22_54_49.142615
path:
- results_2024-02-01T22-54-49.142615.parquet
- split: latest
path:
- results_2024-02-01T22-54-49.142615.parquet
---
# Dataset Card for Evaluation run of h2oai/h2o-danube-1.8b-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h2oai/h2o-danube-1.8b-sft](https://huggingface.co/h2oai/h2o-danube-1.8b-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft",
"harness_winogrande_5",
split="train")
```
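The aggregated metrics mentioned above live in the `results` configuration; below is a short sketch of loading its latest snapshot (the configuration and split names come from the YAML header, while the exact columns of each row depend on the harness output):
```python
from datasets import load_dataset

# "results" aggregates all task metrics for the run; the "latest" split
# always points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```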
## Latest results
These are the [latest results from run 2024-02-01T22:54:49.142615](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft/blob/main/results_2024-02-01T22-54-49.142615.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3428374590678907,
"acc_stderr": 0.033339599143861524,
"acc_norm": 0.34426324885865267,
"acc_norm_stderr": 0.03407406752430132,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826854,
"mc2": 0.4028619731190418,
"mc2_stderr": 0.01428278746898766
},
"harness|arc:challenge|25": {
"acc": 0.37372013651877134,
"acc_stderr": 0.014137708601759098,
"acc_norm": 0.40187713310580203,
"acc_norm_stderr": 0.01432726861457827
},
"harness|hellaswag|10": {
"acc": 0.49790878311093406,
"acc_stderr": 0.004989737768749943,
"acc_norm": 0.6733718382792272,
"acc_norm_stderr": 0.004680215003395913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361061,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361061
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3886792452830189,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.3886792452830189,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416545,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416545
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234092,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234092
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4,
"acc_stderr": 0.027869320571664635,
"acc_norm": 0.4,
"acc_norm_stderr": 0.027869320571664635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617732,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617732
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.038783721137112745,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.038783721137112745
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.035402943770953675,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.035402943770953675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41450777202072536,
"acc_stderr": 0.03555300319557673,
"acc_norm": 0.41450777202072536,
"acc_norm_stderr": 0.03555300319557673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.023290888053772735,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.023290888053772735
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.029213549414372153,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.029213549414372153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.344954128440367,
"acc_stderr": 0.020380605405066962,
"acc_norm": 0.344954128440367,
"acc_norm_stderr": 0.020380605405066962
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4430379746835443,
"acc_stderr": 0.03233532777533484,
"acc_norm": 0.4430379746835443,
"acc_norm_stderr": 0.03233532777533484
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.47107438016528924,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.47107438016528924,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258975,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.031660988918880785,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.031660988918880785
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4648786717752235,
"acc_stderr": 0.01783579880629064,
"acc_norm": 0.4648786717752235,
"acc_norm_stderr": 0.01783579880629064
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2976878612716763,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.2976878612716763,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3790849673202614,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.3790849673202614,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3549382716049383,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.3549382716049383,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460997,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460997
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2816166883963494,
"acc_stderr": 0.011487783272786696,
"acc_norm": 0.2816166883963494,
"acc_norm_stderr": 0.011487783272786696
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3034825870646766,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.3034825870646766,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826854,
"mc2": 0.4028619731190418,
"mc2_stderr": 0.01428278746898766
},
"harness|winogrande|5": {
"acc": 0.654301499605367,
"acc_stderr": 0.01336659695193438
},
"harness|gsm8k|5": {
"acc": 0.1508718726307809,
"acc_stderr": 0.009859004137305689
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ai-habitat/hab3_episodes | ---
viewer: false
license: cc-by-nc-4.0
---
# Habitat v0.3.x Episode Datasets and Checkpoints
Episode datasets for Social Navigation and Social Rearrangement tasks. The training dataset has 37k episodes and the evaluation dataset has 1.2k episodes.
In addition, we released a social nav checkpoint trained on the above episodes. Please see here for more details: https://github.com/facebookresearch/habitat-lab/tree/main/habitat-baselines
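The episode files can be fetched directly from this repository, e.g. with `huggingface_hub` (a minimal sketch; see the habitat-lab link above for where to place the downloaded episodes):
```python
from huggingface_hub import snapshot_download

# Download the Social Navigation / Social Rearrangement episodes and the social nav checkpoint files
local_dir = snapshot_download(repo_id="ai-habitat/hab3_episodes", repo_type="dataset")
print(local_dir)  # local cache path containing the downloaded files
```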
# License Notes:
HSSD assets and episodes are provided under the CC-BY-NC license as a subset of the dataset described here: https://3dlg-hcvc.github.io/hssd/ |
hojzas/autotrain-data-autotrain-sophie2 | ---
license: apache-2.0
---
|
rpii2023/lallalala | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 8865217
num_examples: 5247
- name: test
num_bytes: 2544613
num_examples: 1500
download_size: 5971582
dataset_size: 11409830
---
# Dataset Card for "lallalala"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-phpthinh__examplei-mismatch-1389aa-1748961037 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/examplei
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-7b1
metrics: ['f1']
dataset_name: phpthinh/examplei
dataset_config: mismatch
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-7b1
* Dataset: phpthinh/examplei
* Config: mismatch
* Split: test
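To inspect the raw prediction files locally, one option is to list and download the repository contents with `huggingface_hub` (a minimal sketch; the exact file layout of the predictions is not documented here):
```python
from huggingface_hub import list_repo_files, snapshot_download

repo_id = "autoevaluate/autoeval-eval-phpthinh__examplei-mismatch-1389aa-1748961037"

# See which prediction files the repo contains, then download them all to the local cache
print(list_repo_files(repo_id, repo_type="dataset"))
local_dir = snapshot_download(repo_id=repo_id, repo_type="dataset")
```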
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
mole-code/lancedb | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 3369851
num_examples: 301
- name: test
num_bytes: 95120
num_examples: 12
download_size: 1019675
dataset_size: 3464971
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Boss9xy/tuan2 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_rishiraj__CatPPT-base | ---
pretty_name: Evaluation run of rishiraj/CatPPT-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rishiraj/CatPPT-base](https://huggingface.co/rishiraj/CatPPT-base) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__CatPPT-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T19:27:18.909562](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__CatPPT-base/blob/main/results_2023-12-18T19-27-18.909562.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563542070521601,\n\
\ \"acc_stderr\": 0.031988233329583234,\n \"acc_norm\": 0.6566445539278223,\n\
\ \"acc_norm_stderr\": 0.03264710446236585,\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6171834778563777,\n\
\ \"mc2_stderr\": 0.015028199912315715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598677,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946531\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6739693288189603,\n\
\ \"acc_stderr\": 0.004678006403691718,\n \"acc_norm\": 0.8663612826130253,\n\
\ \"acc_norm_stderr\": 0.003395683338056335\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778415,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778415\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289726,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289726\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.01659802212058043,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.01659802212058043\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182653,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031218,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031218\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.0287951855742913,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.0287951855742913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.6171834778563777,\n\
\ \"mc2_stderr\": 0.015028199912315715\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n \
\ \"acc_stderr\": 0.01254183081546149\n }\n}\n```"
repo_url: https://huggingface.co/rishiraj/CatPPT-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|arc:challenge|25_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|gsm8k|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hellaswag|10_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T19-27-18.909562.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T19-27-18.909562.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- '**/details_harness|winogrande|5_2023-12-18T19-27-18.909562.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T19-27-18.909562.parquet'
- config_name: results
data_files:
- split: 2023_12_18T19_27_18.909562
path:
- results_2023-12-18T19-27-18.909562.parquet
- split: latest
path:
- results_2023-12-18T19-27-18.909562.parquet
---
# Dataset Card for Evaluation run of rishiraj/CatPPT-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rishiraj/CatPPT-base](https://huggingface.co/rishiraj/CatPPT-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rishiraj__CatPPT-base",
"harness_winogrande_5",
split="train")
```
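The aggregated metrics live in the "results" configuration listed above and can be loaded the same way. A minimal sketch (config and split names are taken from the YAML configs of this card):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_rishiraj__CatPPT-base",
	"results",
	split="latest")
print(results[0])
```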
## Latest results
These are the [latest results from run 2023-12-18T19:27:18.909562](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__CatPPT-base/blob/main/results_2023-12-18T19-27-18.909562.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6563542070521601,
"acc_stderr": 0.031988233329583234,
"acc_norm": 0.6566445539278223,
"acc_norm_stderr": 0.03264710446236585,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.6171834778563777,
"mc2_stderr": 0.015028199912315715
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598677,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946531
},
"harness|hellaswag|10": {
"acc": 0.6739693288189603,
"acc_stderr": 0.004678006403691718,
"acc_norm": 0.8663612826130253,
"acc_norm_stderr": 0.003395683338056335
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778415,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778415
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289726,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289726
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246572,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.01659802212058043,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.01659802212058043
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182653,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031218,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031218
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.0287951855742913,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.0287951855742913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.6171834778563777,
"mc2_stderr": 0.015028199912315715
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.01254183081546149
}
}
```
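The same numbers can also be read from the raw JSON file linked above. A minimal sketch using `huggingface_hub` (the filename is the one referenced in the link; the key layout of the file is not asserted here, so the sketch only inspects the top-level keys):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_rishiraj__CatPPT-base",
    filename="results_2023-12-18T19-27-18.909562.json",
    repo_type="dataset",
)

with open(path) as f:
    run_results = json.load(f)

print(list(run_results))  # inspect the top-level keys of the run's results file
```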
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Dobt1/DonkeyKong | ---
license: openrail
---
|
Ranjan22/Medium_Articles | ---
license: mit
---
|
shansuryajaya/arabic-architecture | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2921299.0
num_examples: 30
download_size: 2922793
dataset_size: 2921299.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ZoabiTalal/Dataset-Goldbach-1.0 | ---
license: mit
task_categories:
- text-classification
- token-classification
language:
- en
tags:
- code
size_categories:
- 10M<n<100M
--- |
one-sec-cv12/chunk_260 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 17212089744.625
num_examples: 179203
download_size: 14604590618
dataset_size: 17212089744.625
---
# Dataset Card for "chunk_260"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kvriza8/AF_images | ---
license: mit
dataset_info:
features:
- name: caption
dtype: string
- name: caption_summary
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 105820185.375
num_examples: 1861
download_size: 105551754
dataset_size: 105820185.375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hac541309/polyglot-ko-tokenizer-corpus-merge_ws | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 17351410727
num_examples: 11808255
download_size: 4418578989
dataset_size: 17351410727
---
# Dataset Card for "polyglot-ko-tokenizer-corpus-merge_ws"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linhtran92/asr_data_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 3658424.0
num_examples: 44
download_size: 3640862
dataset_size: 3658424.0
---
# Dataset Card for "asr_data_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sourabh2/Abstract_of_article | ---
dataset_info:
features:
- name: abstract
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 9988481
num_examples: 1000
download_size: 3617033
dataset_size: 9988481
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-89000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1118981
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tmnam20/VietnameseMedicalQA-raw | ---
dataset_info:
- config_name: all
features:
- name: document_idx
dtype: int64
- name: section_idx
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: article_url
dtype: string
- name: author_url
dtype: string
- name: author
dtype: string
- name: subsection_idx
dtype: int64
- name: content_idx
dtype: int64
- name: title
dtype: string
- name: keyword
dtype: string
splits:
- name: train
num_bytes: 30983801
num_examples: 32318
download_size: 11456882
dataset_size: 30983801
- config_name: body-part
features:
- name: document_idx
dtype: int64
- name: section_idx
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: article_url
dtype: string
- name: author_url
dtype: string
- name: author
dtype: string
- name: subsection_idx
dtype: int64
- name: content_idx
dtype: int64
- name: title
dtype: string
- name: keyword
dtype: string
splits:
- name: train
num_bytes: 2251827
num_examples: 1894
download_size: 874959
dataset_size: 2251827
- config_name: disease
features:
- name: document_idx
dtype: int64
- name: section_idx
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: article_url
dtype: string
- name: author_url
dtype: string
- name: author
dtype: string
- name: subsection_idx
dtype: int64
- name: content_idx
dtype: int64
- name: title
dtype: string
- name: keyword
dtype: string
splits:
- name: train
num_bytes: 8164010
num_examples: 6616
download_size: 3163801
dataset_size: 8164010
- config_name: drug
features:
- name: document_idx
dtype: int64
- name: section_idx
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: article_url
dtype: string
- name: author_url
dtype: string
- name: author
dtype: string
- name: subsection_idx
dtype: int64
- name: content_idx
dtype: int64
- name: title
dtype: string
- name: keyword
dtype: string
splits:
- name: train
num_bytes: 13120425
num_examples: 15608
download_size: 4451159
dataset_size: 13120425
- config_name: medicine
features:
- name: document_idx
dtype: int64
- name: section_idx
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: article_url
dtype: string
- name: author_url
dtype: string
- name: author
dtype: string
- name: subsection_idx
dtype: int64
- name: content_idx
dtype: int64
- name: title
dtype: string
- name: keyword
dtype: string
splits:
- name: train
num_bytes: 7447539
num_examples: 8200
download_size: 2991259
dataset_size: 7447539
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
default: true
- config_name: body-part
data_files:
- split: train
path: body-part/train-*
- config_name: disease
data_files:
- split: train
path: disease/train-*
- config_name: drug
data_files:
- split: train
path: drug/train-*
- config_name: medicine
data_files:
- split: train
path: medicine/train-*
---
|
huggingartists/ciggy-blacc | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/ciggy-blacc"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 4014.257119 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/7ba8a81d32ea254df43b31447958e85f.500x500x1.png')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/ciggy-blacc">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ciggy Blacc</div>
<a href="https://genius.com/artists/ciggy-blacc">
<div style="text-align: center; font-size: 14px;">@ciggy-blacc</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/ciggy-blacc).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ciggy-blacc")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|23| -| -|
The 'train' split can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/ciggy-blacc")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
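Continuing from the snippet above, with the 23 examples reported in the split table these percentages give roughly 20/2/1 examples per split; a quick sanity check:
```python
for name, split in datasets.items():
    print(name, len(split))
```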
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year={2022}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
sguo08/ops | ---
task_categories:
- table-question-answering
language:
- zh
tags:
- code
size_categories:
- 100K<n<1M
--- |
open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3 | ---
pretty_name: Evaluation run of hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3](https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T16:25:35.827277](https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3/blob/main/results_2024-03-31T16-25-35.827277.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6375746321703655,\n\
\ \"acc_stderr\": 0.03225546197812389,\n \"acc_norm\": 0.6434618962614028,\n\
\ \"acc_norm_stderr\": 0.032904960223920136,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215137349816427,\n\
\ \"mc2_stderr\": 0.014137575959685471\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642476,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946709\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685962,\n \"acc_norm\": 0.8330013941445927,\n\
\ \"acc_norm_stderr\": 0.0037221237096104645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792579,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792579\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559806,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215137349816427,\n\
\ \"mc2_stderr\": 0.014137575959685471\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37907505686125853,\n \
\ \"acc_stderr\": 0.013363630295088347\n }\n}\n```"
repo_url: https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|arc:challenge|25_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|gsm8k|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hellaswag|10_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-25-35.827277.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T16-25-35.827277.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- '**/details_harness|winogrande|5_2024-03-31T16-25-35.827277.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T16-25-35.827277.parquet'
- config_name: results
data_files:
- split: 2024_03_31T16_25_35.827277
path:
- results_2024-03-31T16-25-35.827277.parquet
- split: latest
path:
- results_2024-03-31T16-25-35.827277.parquet
---
# Dataset Card for Evaluation run of hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3](https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3",
"harness_winogrande_5",
split="train")
```
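If you prefer to use the split names exactly as declared in the YAML header above, or to pin a specific evaluation run instead of the moving `latest` results, the following minimal sketch shows both options (it assumes the `harness_hendrycksTest_abstract_algebra_5` config and the split names listed above):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3"

# "latest" always mirrors the most recent run for this task config
latest = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# The timestamped split pins the run from 2024-03-31T16:25:35
pinned = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5",
                      split="2024_03_31T16_25_35.827277")

print(len(latest), latest.column_names)
```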
## Latest results
These are the [latest results from run 2024-03-31T16:25:35.827277](https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3/blob/main/results_2024-03-31T16-25-35.827277.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6375746321703655,
"acc_stderr": 0.03225546197812389,
"acc_norm": 0.6434618962614028,
"acc_norm_stderr": 0.032904960223920136,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215137349816427,
"mc2_stderr": 0.014137575959685471
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642476,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946709
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685962,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.0037221237096104645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464074,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792579,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559806,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215137349816427,
"mc2_stderr": 0.014137575959685471
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.37907505686125853,
"acc_stderr": 0.013363630295088347
}
}
```
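To work with the aggregated numbers above programmatically rather than reading the JSON by hand, one option is to load the `results` configuration declared in the YAML header. This is only a minimal sketch, assuming the `results` config and its `latest` split behave like the task configs:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3"

# The "results" config stores the aggregated metrics of each evaluation run
results = load_dataset(REPO, "results", split="latest")
print(results)
```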
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arsalanaa/oilpaint_datasets | ---
license: unknown
---
|
CyberHarem/mochizuki_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mochizuki (Kantai Collection)
This is the dataset of mochizuki (Kantai Collection), containing 388 images and their tags.
The core tags of this character are `brown_hair, long_hair, glasses, brown_eyes, red-framed_eyewear, semi-rimless_eyewear, under-rim_eyewear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 388 | 282.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochizuki_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 388 | 202.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochizuki_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 847 | 415.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochizuki_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 388 | 264.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochizuki_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 847 | 519.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mochizuki_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
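The packaged zips in the table can be fetched the same way as the raw archive. Below is a minimal sketch for the `800` package, assuming you only need the plain IMG+TXT files (the filename `dataset-800.zip` comes from the download link in the table above):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/mochizuki_kantaicollection',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the images and their .txt tag files to a local directory
output_dir = 'dataset_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```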
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mochizuki_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, black_serafuku, solo, looking_at_viewer, black_sailor_collar, crescent_pin, simple_background, white_necktie, neckerchief, white_background, upper_body, long_sleeves, hair_between_eyes, skirt |
| 1 | 7 |  |  |  |  |  | 1girl, black_serafuku, blush, simple_background, solo, white_background, looking_at_viewer, crescent, necktie, pleated_skirt, long_sleeves, twitter_username |
| 2 | 5 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, crescent_pin, long_sleeves, solo, white_necktie, pleated_skirt, white_socks, blush, sailor_collar |
| 3 | 6 |  |  |  |  |  | 1girl, crescent, necktie, skirt, solo, black_serafuku, looking_at_viewer, open_mouth, long_sleeves |
| 4 | 9 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, kneehighs, long_sleeves, solo, white_necktie, crescent_pin, white_socks, black_sailor_collar, full_body, black_shirt, open_mouth, pleated_skirt, simple_background, bangs, brown_footwear, loafers, blush, hair_between_eyes, looking_at_viewer, standing, very_long_hair, white_background |
| 5 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, one-piece_swimsuit, school_swimsuit, solo, crescent, flat_chest, open_mouth, covered_navel, cowboy_shot, twitter_username |
| 6 | 5 |  |  |  |  |  | flat_chest, looking_at_viewer, 1girl, navel, solo, white_bikini, cowboy_shot, side-tie_bikini_bottom, blue_background, cloud, open_mouth, sky, smile |
| 7 | 5 |  |  |  |  |  | black_dress, enmaided, maid_apron, white_apron, 1girl, frilled_apron, looking_at_viewer, maid_headdress, solo, blush, hair_between_eyes, long_sleeves, open_mouth, bangs, chibi, cowboy_shot, crescent_pin, puffy_sleeves, simple_background, smile, white_background |
| 8 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, fellatio, bar_censor, pov, solo_focus, cum_in_mouth, looking_at_viewer, nude, saliva, sweat, tears, veiny_penis, erection, full-face_blush, large_penis, licking, long_sleeves, nipples |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_serafuku | solo | looking_at_viewer | black_sailor_collar | crescent_pin | simple_background | white_necktie | neckerchief | white_background | upper_body | long_sleeves | hair_between_eyes | skirt | blush | crescent | necktie | pleated_skirt | twitter_username | black_skirt | white_socks | sailor_collar | open_mouth | kneehighs | full_body | black_shirt | bangs | brown_footwear | loafers | standing | very_long_hair | one-piece_swimsuit | school_swimsuit | flat_chest | covered_navel | cowboy_shot | navel | white_bikini | side-tie_bikini_bottom | blue_background | cloud | sky | smile | black_dress | enmaided | maid_apron | white_apron | frilled_apron | maid_headdress | chibi | puffy_sleeves | 1boy | hetero | fellatio | bar_censor | pov | solo_focus | cum_in_mouth | nude | saliva | sweat | tears | veiny_penis | erection | full-face_blush | large_penis | licking | nipples |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------------|:----------------------|:---------------|:--------------------|:----------------|:--------------|:-------------------|:-------------|:---------------|:--------------------|:--------|:--------|:-----------|:----------|:----------------|:-------------------|:--------------|:--------------|:----------------|:-------------|:------------|:------------|:--------------|:--------|:-----------------|:----------|:-----------|:-----------------|:---------------------|:------------------|:-------------|:----------------|:--------------|:--------|:---------------|:-------------------------|:------------------|:--------|:------|:--------|:--------------|:-----------|:-------------|:--------------|:----------------|:-----------------|:--------|:----------------|:-------|:---------|:-----------|:-------------|:------|:-------------|:---------------|:-------|:---------|:--------|:--------|:--------------|:-----------|:------------------|:--------------|:----------|:----------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | X | | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | | X | | X | | | | X | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | X | X | | X | | | X | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | X | | | X | | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | X | | X | X | | | X | | X | X | | X | | | | | | | | X | | | | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
BEE-spoke-data/scientificbeekeeping | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
- name: title
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 10438862
num_examples: 471
download_size: 4117007
dataset_size: 10438862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# scientificbeekeeping
raw webtext |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-55000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 5509152894
num_examples: 1000
download_size: 1125183116
dataset_size: 5509152894
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mozilla-foundation/common_voice_6_0 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
ab:
- n<1K
ar:
- 10K<n<100K
as:
- n<1K
br:
- 10K<n<100K
ca:
- 100K<n<1M
cnh:
- 1K<n<10K
cs:
- 10K<n<100K
cv:
- 10K<n<100K
cy:
- 10K<n<100K
de:
- 100K<n<1M
dv:
- 10K<n<100K
el:
- 10K<n<100K
en:
- 1M<n<10M
eo:
- 10K<n<100K
es:
- 100K<n<1M
et:
- 10K<n<100K
eu:
- 10K<n<100K
fa:
- 100K<n<1M
fi:
- 1K<n<10K
fr:
- 100K<n<1M
fy-NL:
- 10K<n<100K
ga-IE:
- 1K<n<10K
hi:
- n<1K
hsb:
- 1K<n<10K
hu:
- 1K<n<10K
ia:
- 1K<n<10K
id:
- 10K<n<100K
it:
- 100K<n<1M
ja:
- 1K<n<10K
ka:
- 1K<n<10K
kab:
- 100K<n<1M
ky:
- 10K<n<100K
lg:
- 1K<n<10K
lt:
- 1K<n<10K
lv:
- 1K<n<10K
mn:
- 10K<n<100K
mt:
- 10K<n<100K
nl:
- 10K<n<100K
or:
- 1K<n<10K
pa-IN:
- 1K<n<10K
pl:
- 100K<n<1M
pt:
- 10K<n<100K
rm-sursilv:
- 1K<n<10K
rm-vallader:
- 1K<n<10K
ro:
- 1K<n<10K
ru:
- 10K<n<100K
rw:
- 1M<n<10M
sah:
- 1K<n<10K
sl:
- 1K<n<10K
sv-SE:
- 10K<n<100K
ta:
- 10K<n<100K
th:
- 10K<n<100K
tr:
- 10K<n<100K
tt:
- 10K<n<100K
uk:
- 10K<n<100K
vi:
- 1K<n<10K
vot:
- n<1K
zh-CN:
- 10K<n<100K
zh-HK:
- 10K<n<100K
zh-TW:
- 10K<n<100K
source_datasets:
- extended|common_voice
paperswithcode_id: common-voice
pretty_name: Common Voice Corpus 6.0
language_bcp47:
- ab
- ar
- as
- br
- ca
- cnh
- cs
- cv
- cy
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy-NL
- ga-IE
- hi
- hsb
- hu
- ia
- id
- it
- ja
- ka
- kab
- ky
- lg
- lt
- lv
- mn
- mt
- nl
- or
- pa-IN
- pl
- pt
- rm-sursilv
- rm-vallader
- ro
- ru
- rw
- sah
- sl
- sv-SE
- ta
- th
- tr
- tt
- uk
- vi
- vot
- zh-CN
- zh-HK
- zh-TW
extra_gated_prompt: By clicking on “Access repository” below, you also agree to not
attempt to determine the identity of speakers in the Common Voice dataset.
task_categories:
- automatic-speech-recognition
---
# Dataset Card for Common Voice Corpus 6.0
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Anton Lozhkov](mailto:anton@huggingface.co)
### Dataset Summary
Each entry in the Common Voice dataset consists of a unique MP3 file and a corresponding text file.
Many of the 9261 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 7327 validated hours in 60 languages, but more voices and languages are always added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
### Languages
```
Abkhaz, Arabic, Assamese, Basque, Breton, Catalan, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Dhivehi, Dutch, English, Esperanto, Estonian, Finnish, French, Frisian, Georgian, German, Greek, Hakha Chin, Hindi, Hungarian, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kinyarwanda, Kyrgyz, Latvian, Lithuanian, Luganda, Maltese, Mongolian, Odia, Persian, Polish, Portuguese, Punjabi, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Slovenian, Sorbian (Upper), Spanish, Swedish, Tamil, Tatar, Thai, Turkish, Ukrainian, Vietnamese, Votic, Welsh
```
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
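As a concrete illustration of the `audio` field behavior described above, here is a minimal sketch assuming the Estonian (`et`) config and the `test` split; the 16 kHz target rate is just an example value:
```python
from datasets import Audio, load_dataset

ds = load_dataset("mozilla-foundation/common_voice_6_0", "et", split="test", use_auth_token=True)

# Query the sample index first, then the "audio" column: decoding happens per example
sample = ds[0]["audio"]
print(sample["sampling_rate"])  # 48000 in the raw release

# Resample on the fly by casting the column, e.g. to 16 kHz for typical ASR models
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
print(ds[0]["audio"]["sampling_rate"])  # 16000
```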
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated by reviewers and has received upvotes indicating that the data is of high quality.
The invalidated data is data that has been invalidated by reviewers and has received downvotes indicating that the data is of low quality.
The reported data is data that has been reported, for different reasons.
The other data is data that has not yet been reviewed.
The dev, test, and train portions are all data that has been reviewed, deemed of high quality, and split into dev, test, and train sets.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.“_. These trailing quotation marks do not change the actual meaning of the sentence, and it is nearly impossible to infer whether a sentence is a quotation or not from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_6_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
Neuronovo/neuronovo-utc-data-glue-mnli | ---
dataset_info:
features:
- name: x
dtype: string
- name: y
dtype: int64
- name: label_id
dtype: int64
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 492690051
num_examples: 1119828
- name: validation
num_bytes: 25718605
num_examples: 58278
- name: test
num_bytes: 26234868
num_examples: 58941
download_size: 144048422
dataset_size: 544643524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
davanstrien/autotrain-data-onthebooksmodel | |
tux/alphafold_issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
dtype: float64
- name: assignees
sequence: 'null'
- name: milestone
dtype: float64
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: 'null'
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 838906
num_examples: 200
download_size: 195220
dataset_size: 838906
---
# Dataset Card for "alphafold_issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Admin0805/Beaconchainproofofstake | ---
license: other
license_name: citibankdemobusiness
license_link: https://citibankdemobusiness.dev
---
|
Nevertree/dataset2modeltest | ---
license: other
---
|
open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2 | ---
pretty_name: Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [beaugogh/Llama2-7b-openorca-mc-v2](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T04:26:13.148346](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2/blob/main/results_2023-10-15T04-26-13.148346.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0030411073825503355,\n\
\ \"em_stderr\": 0.000563889690875318,\n \"f1\": 0.06320574664429535,\n\
\ \"f1_stderr\": 0.0014620292630980185,\n \"acc\": 0.39116058002373183,\n\
\ \"acc_stderr\": 0.00935782744756563\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.000563889690875318,\n\
\ \"f1\": 0.06320574664429535,\n \"f1_stderr\": 0.0014620292630980185\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.006216328640238128\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n\
\ }\n}\n```"
repo_url: https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|arc:challenge|25_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T04_26_13.148346
path:
- '**/details_harness|drop|3_2023-10-15T04-26-13.148346.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T04-26-13.148346.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T04_26_13.148346
path:
- '**/details_harness|gsm8k|5_2023-10-15T04-26-13.148346.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T04-26-13.148346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hellaswag|10_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T08_24_57.016837
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T08:24:57.016837.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T08:24:57.016837.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T04_26_13.148346
path:
- '**/details_harness|winogrande|5_2023-10-15T04-26-13.148346.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T04-26-13.148346.parquet'
- config_name: results
data_files:
- split: 2023_10_15T04_26_13.148346
path:
- results_2023-10-15T04-26-13.148346.parquet
- split: latest
path:
- results_2023-10-15T04-26-13.148346.parquet
---
# Dataset Card for Evaluation run of beaugogh/Llama2-7b-openorca-mc-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [beaugogh/Llama2-7b-openorca-mc-v2](https://huggingface.co/beaugogh/Llama2-7b-openorca-mc-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2",
"harness_winogrande_5",
split="train")
```
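The aggregated results configuration can be loaded the same way. A minimal sketch, using the `results` config and its `latest` split listed in the metadata above (the exact columns it exposes depend on the evaluation run):
```python
from datasets import load_dataset

# Aggregated metrics of the latest run; the "results" config and "latest"
# split names come from this card's metadata. Inspect the returned columns,
# since their exact layout depends on the evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2",
    "results",
    split="latest",
)
print(results)
```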
## Latest results
These are the [latest results from run 2023-10-15T04:26:13.148346](https://huggingface.co/datasets/open-llm-leaderboard/details_beaugogh__Llama2-7b-openorca-mc-v2/blob/main/results_2023-10-15T04-26-13.148346.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0030411073825503355,
"em_stderr": 0.000563889690875318,
"f1": 0.06320574664429535,
"f1_stderr": 0.0014620292630980185,
"acc": 0.39116058002373183,
"acc_stderr": 0.00935782744756563
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.000563889690875318,
"f1": 0.06320574664429535,
"f1_stderr": 0.0014620292630980185
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.006216328640238128
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893129
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Hwangseon/customhscode | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7504
num_examples: 36
download_size: 3322
dataset_size: 7504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FelixdoingAI/IP2P-edit-SSLWM-try-step50-7.5_1.5-200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: original_prompt
dtype: string
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_prompt
dtype: string
- name: edited_image
dtype: image
- name: adversarial_image
dtype: image
- name: edit_adv_image
dtype: image
splits:
- name: train
num_bytes: 90630546.0
num_examples: 200
download_size: 0
dataset_size: 90630546.0
---
# Dataset Card for "IP2P-edit-SSLWM-try-step50-7.5_1.5-200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sharmaraju352/stackoverflow-kubernetes-questions-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 59421164
num_examples: 22832
download_size: 28605854
dataset_size: 59421164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Coder-Dragon/wikipedia-movies | ---
license: apache-2.0
task_categories:
- feature-extraction
language:
- en
tags:
- art
- music
size_categories:
- 10K<n<100K
---
### Wikipedia Movie Plots with Images.
30,000+ movie plot descriptions and images.
Plot summary descriptions of movies scraped from Wikipedia.
This dataset is a subset of this [dataset](https://www.kaggle.com/datasets/jrobischon/wikipedia-movie-plots).
### Content
The dataset contains descriptions of 34,886 movies from around the world. Column descriptions are listed below:
*Release Year* - Year in which the movie was released<br>
*Title* - Movie title<br>
*Origin/Ethnicity* - Origin of movie (i.e. American, Bollywood, Tamil, etc.)<br>
*Director* - Director(s)<br>
*Genre* - Movie Genre(s)<br>
*Cast* - Main actors and actresses<br>
*Wiki Page* - URL of the Wikipedia page from which the plot description was scraped<br>
*Plot* - Long form description of movie plot (WARNING: May contain spoilers)<br>
*Image* - Poster of movie<br>
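For illustration, a minimal loading sketch with the `datasets` library; the repository id comes from this card, but the `train` split and the exact column spellings are assumptions based on the column list above:
```python
from datasets import load_dataset

# A sketch only: the split name and the column spellings are assumptions
# inferred from the column descriptions in this card.
movies = load_dataset("Coder-Dragon/wikipedia-movies", split="train")
print(movies.column_names)
sample = movies[0]
print(sample["Title"], "-", sample["Plot"][:200])
```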
### Use Case:
*Movie Search by Plots*: https://github.com/shivamarora1/msp |
joey234/mmlu-computer_security-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 4609.04
num_examples: 17
download_size: 6084
dataset_size: 4609.04
---
# Dataset Card for "mmlu-computer_security-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hehe77/llama2_test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AntoineBlanot/xnli-fused | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: language
dtype: string
splits:
- name: train
num_bytes: 1622312699
num_examples: 5890530
- name: validation
num_bytes: 9825139
num_examples: 37350
- name: test
num_bytes: 19908472
num_examples: 75150
download_size: 883019304
dataset_size: 1652046310
---
# Dataset Card for "xnli-fused"
## Dataset Summary
This dataset is the [XNLI](https://huggingface.co/datasets/xnli) dataset where all languages have been fused into a single configuration for multilingual training. Please refer to the original dataset for more information.
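A minimal loading sketch, using the splits and fields declared in the metadata above:
```python
from datasets import load_dataset

# premise / hypothesis / label / language fields and the train, validation
# and test splits are declared in this card's metadata.
xnli = load_dataset("AntoineBlanot/xnli-fused", split="validation")
row = xnli[0]
print(row["language"], row["label"])
print(row["premise"])
print(row["hypothesis"])
```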
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Heng666/Taiwan-patent-qa-eval | ---
dataset_info:
features:
- name: question
dtype: string
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 94331
num_examples: 192
download_size: 55655
dataset_size: 94331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- question-answering
language:
- zh
tags:
- traditional chinese
- patent
- taiwan
pretty_name: taiwan-patent-qa-eval
size_categories:
- n<1K
---
# Taiwan Patent Question-Answering Dataset
We present a patent question-answering dataset intended for QA systems. It mainly collects material produced in Taiwan, covering eight years of patent attorney training exam questions, 192 questions in total. The goal is to improve how well language models handle real-world scenarios in the Taiwan domain.
<p align="center">
<img src="https://huggingface.co/datasets/Heng666/Taiwan-patent-qa-eval/resolve/main/Taiwan Patent Q&A Map.webp" style="max-width: 400" width=400 />
</p>
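A minimal loading sketch, iterating over the multiple-choice questions; the field names (`question`, `A`, `B`, `C`, `D`, `answer`, `source`) and the `train` split come from the metadata above:
```python
from datasets import load_dataset

# Field names and the "train" split are declared in this card's metadata.
qa = load_dataset("Heng666/Taiwan-patent-qa-eval", split="train")
for row in qa.select(range(3)):
    print(row["question"])
    for option in ("A", "B", "C", "D"):
        print(f"  {option}. {row[option]}")
    print("  answer:", row["answer"], "| source:", row["source"])
```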
# Citation
```
@article{TaiwanPatent2024eval,
title={An Patent Evaulutaion for Taiwan Language Model},
author={soaring0616, Heng-Shiou Sheu},
journal={arXiv},
year={2024}
}
```
|
Columbia-NLP/ProLex | ---
license: apache-2.0
dataset_info:
features:
- name: target word
dtype: string
- name: Sentence
dtype: string
- name: acc_subs
dtype: string
- name: unacc_subs
dtype: string
- name: prof_acc_subs
dtype: string
- name: prof_unacc_subs
dtype: string
- name: t_words_cefr
dtype: int64
- name: prof_acc_cefr
dtype: string
- name: prof_unacc_cefr
dtype: string
splits:
- name: train
num_bytes: 191230
num_examples: 680
download_size: 115218
dataset_size: 191230
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|