datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Asap7772/persona_sft | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: model
dtype: string
- name: x
dtype: string
- name: y
dtype: string
splits:
- name: train
num_bytes: 327567.3088235294
num_examples: 183
- name: test
num_bytes: 37589.69117647059
num_examples: 21
download_size: 0
dataset_size: 365157.0
---
# Dataset Card for "persona_sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_double_modals | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 52950
num_examples: 239
- name: test
num_bytes: 25962
num_examples: 145
- name: train
num_bytes: 91268
num_examples: 411
download_size: 117479
dataset_size: 170180
---
# Dataset Card for "MULTI_VALUE_stsb_double_modals"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
canberra/operation-neptune | ---
license: mit
---
|
Nielser/minithresh | ---
license: afl-3.0
---
|
autoevaluate/autoeval-eval-joelito__brazilian_court_decisions-joelito__brazilian_c-4bed1b-1985466168 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- joelito/brazilian_court_decisions
eval_info:
task: multi_class_classification
model: Luciano/bertimbau-base-finetuned-lener-br-finetuned-brazilian_court_decisions
metrics: []
dataset_name: joelito/brazilian_court_decisions
dataset_config: joelito--brazilian_court_decisions
dataset_split: test
col_mapping:
text: decision_description
target: judgment_label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: Luciano/bertimbau-base-finetuned-lener-br-finetuned-brazilian_court_decisions
* Dataset: joelito/brazilian_court_decisions
* Config: joelito--brazilian_court_decisions
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
mayflowergmbh/intel_orca_phi_dpo | ---
license: apache-2.0
---
|
Finnish-NLP/oscar_2301_fi_cleaned | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: warc_headers
struct:
- name: warc-record-id
dtype: string
- name: warc-date
dtype: string
- name: content-type
dtype: string
- name: content-length
dtype: int32
- name: warc-type
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-block-digest
dtype: string
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float32
- name: harmful_pp
dtype: float32
- name: tlsh
dtype: string
- name: quality_warnings
sequence: string
- name: categories
sequence: string
- name: sentence_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float32
- name: perplexity_kenlm
dtype: int64
- name: url
dtype: string
- name: label_identity_attack
dtype: float64
- name: label_insult
dtype: float64
- name: label_obscene
dtype: float64
- name: label_severe_toxicity
dtype: float64
- name: label_threat
dtype: float64
- name: label_toxicity
dtype: float64
splits:
- name: train
num_bytes: 40449678552
num_examples: 5225577
download_size: 2848314172
dataset_size: 40449678552
---
# Dataset Card for "oscar_2301_fi_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OleehyO/latex-formulas | ---
license: openrail
task_categories:
- image-to-text
---
# Big News!
We trained a formula recognition model, [TexTeller](https://github.com/OleehyO/TexTeller?tab=readme-ov-file), using the latex-formulas dataset. It can convert images into LaTeX formulas and boasts **high accuracy** and **strong generalization capabilities**, covering **most formula recognition scenarios**.
> For more details, please refer to the [TexTeller GitHub repository](https://github.com/OleehyO/TexTeller?tab=readme-ov-file).
# Dataset Description
> [Chinese version](./README_zh.md)
There are two datasets: **raw_formulas** and **cleaned_formulas** (the latter has **550K formula-image pairs**).
To create the *raw_formulas* dataset, we scraped approximately 1 million LaTeX formula image-text pairs from *arXiv*, uncleaned and without text segmentation. After cleaning the *raw_formulas* dataset and integrating it with the [im2latex-100K](https://zenodo.org/records/56198#.V2px0jXT6eA) dataset, we obtained the *cleaned_formulas* dataset, which has **550K** formula-image pairs.
To render the images corresponding to the formulas, the following external packages are needed:
* amsmath
* amsfonts
* amssymb
* mathtools
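For illustration, here is a minimal rendering sketch (not part of the dataset tooling) that compiles a single formula with these packages; it assumes `pdflatex` is installed on the system and that formulas are stored together with their environment markers (e.g. an `align*` block).
```python
import pathlib
import subprocess
import tempfile

# Document template using the packages listed above.
TEMPLATE = r"""\documentclass{article}
\usepackage{amsmath,amsfonts,amssymb,mathtools}
\pagestyle{empty}
\begin{document}
%s
\end{document}
"""

def render_formula(formula: str, out_dir: str = ".") -> pathlib.Path:
    """Compile one LaTeX formula to a PDF page and return the output path."""
    with tempfile.TemporaryDirectory() as tmp:
        tex_path = pathlib.Path(tmp) / "formula.tex"
        tex_path.write_text(TEMPLATE % formula)
        subprocess.run(
            ["pdflatex", "-interaction=nonstopmode",
             "-output-directory", out_dir, str(tex_path)],
            check=True,
        )
    return pathlib.Path(out_dir) / "formula.pdf"

# Example (hypothetical formula, not taken from the dataset):
# render_formula(r"\begin{align*} f(x) = \int_{-\infty}^{x} e^{-t^{2}}\,dt \end{align*}")
```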
## Usage
For the **raw_formulas** dataset:
```python
from datasets import load_dataset
data = load_dataset("OleehyO/latex-formulas", "raw_formulas")
```
For the **cleaned_formulas** dataset:
```python
from datasets import load_dataset
data = load_dataset("OleehyO/latex-formulas", "cleaned_formulas")
```
## Details About the *raw_formulas* Dataset
We scraped LaTeX formulas containing the following environments:
* equation
* align
* align*
* gather
* gather*
The formulas do not include the following content:
* \label
* %
* \quad
* \qquad
* \vspace
* \hspace
* \resizebox
* \scalebox
* \rotatebox
* \parbox
* \fbox
* \makebox
* \raisebox
* \addvspace
* \hfill
* \vfill
* \textwidth
* \textheight
* \rule
## Preprocessing Details of the *cleaned_formulas* Dataset
### Cleaning
* We removed some useless junk data from both *raw_formulas* and [im2latex-100K](https://zenodo.org/records/56198#.V2px0jXT6eA).
* We deleted overly complex formulas from both *raw_formulas* and [im2latex-100K](https://zenodo.org/records/56198#.V2px0jXT6eA):
* Formulas were deleted if the aspect ratio of the corresponding rendered image was greater than 0.8.
* Formulas with a character length greater than 200 were deleted.
* In the formulas from both *raw_formulas* and [im2latex-100K](https://zenodo.org/records/56198#.V2px0jXT6eA), the following content was removed:
* \tag
* \text
* \begin{split}
* \end{split}
* \nonumber
* \notag
* The `equation`, `equation*`, `align`, `\[...\]` environments in *raw_formulas* were all replaced with the `align*` environment.
* We deleted formulas from *raw_formulas* that contained custom macros.
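As a rough illustration of the text-level filtering above (the authors' exact rules and regular expressions are not published in this card, so this is only an approximation; the rendered-image aspect-ratio check is omitted):
```python
import re

# Commands stripped from the formulas, per the cleaning steps above.
STRIP_PATTERNS = [
    r"\\tag\*?\{[^}]*\}",
    r"\\text\{[^}]*\}",        # approximation of the "\text" removal
    r"\\begin\{split\}", r"\\end\{split\}",
    r"\\nonumber", r"\\notag",
]

def clean_formula(formula: str, max_len: int = 200):
    """Return a cleaned formula, or None if it should be dropped."""
    for pattern in STRIP_PATTERNS:
        formula = re.sub(pattern, "", formula)
    formula = formula.strip()
    # Overly complex formulas (by character length) are removed.
    if len(formula) > max_len:
        return None
    return formula
```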
|
CodecSR/vox_lingua_top10_16k_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
splits:
- name: original
num_bytes: 1739423546.0
num_examples: 972
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 579867274.0
num_examples: 972
- name: academicodec_hifi_24k_320d
num_bytes: 579867274.0
num_examples: 972
- name: audiodec_24k_300d
num_bytes: 580372714.0
num_examples: 972
- name: audiodec_48k_300d_uni
num_bytes: 580372714.0
num_examples: 972
- name: dac_16k
num_bytes: 579867274.0
num_examples: 972
- name: dac_24k
num_bytes: 579867274.0
num_examples: 972
- name: dac_44k
num_bytes: 579867274.0
num_examples: 972
- name: encodec_24k_12bps
num_bytes: 579867274.0
num_examples: 972
- name: encodec_24k_1_5bps
num_bytes: 579867274.0
num_examples: 972
- name: encodec_24k_24bps
num_bytes: 579867274.0
num_examples: 972
- name: encodec_24k_3bps
num_bytes: 579867274.0
num_examples: 972
- name: encodec_24k_6bps
num_bytes: 579867274.0
num_examples: 972
- name: facodec_16k
num_bytes: 579789514.0
num_examples: 972
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 579867274.0
num_examples: 972
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 579867274.0
num_examples: 972
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 579867274.0
num_examples: 972
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 579867274.0
num_examples: 972
- name: language_codec_chinese_24k_nq8_12kbps
num_bytes: 579867274.0
num_examples: 972
- name: language_codec_paper_24k_nq8_12kbps
num_bytes: 579867274.0
num_examples: 972
- name: speech_tokenizer_16k
num_bytes: 579867274.0
num_examples: 972
download_size: 8318807251
dataset_size: 13337702146.0
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_300d
path: data/audiodec_24k_300d-*
- split: audiodec_48k_300d_uni
path: data/audiodec_48k_300d_uni-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: facodec_16k
path: data/facodec_16k-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: language_codec_chinese_24k_nq8_12kbps
path: data/language_codec_chinese_24k_nq8_12kbps-*
- split: language_codec_paper_24k_nq8_12kbps
path: data/language_codec_paper_24k_nq8_12kbps-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
Chaymaa/grdf-inferenceC | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 13421746.731123388
num_examples: 325
- name: test
num_bytes: 4522990.134438306
num_examples: 109
- name: valid
num_bytes: 4576454.134438306
num_examples: 109
download_size: 20381217
dataset_size: 22521191.0
---
# Dataset Card for "grdf-inferenceC"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HumanCompatibleAI/ppo-seals-Walker2d-v0 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float64
splits:
- name: train
num_bytes: 60728770
num_examples: 104
download_size: 21507130
dataset_size: 60728770
---
# Dataset Card for "ppo-seals-Walker2d-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HarryAJMK418/IDV | ---
license: openrail
---
|
ibranze/araproje_arc_en_conf_mgpt_nearestscore_true | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 80031.0
num_examples: 250
download_size: 46813
dataset_size: 80031.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_en_conf_mgpt_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nathanReitinger/mlcb | ---
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 8132250961
num_examples: 76369
- name: test
num_bytes: 897865830
num_examples: 8486
download_size: 2715307703
dataset_size: 9030116791
---
# Dataset Card for "mlcb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harshiitsingh/flipkart-scraped-dresses-10 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 102203.0
num_examples: 10
download_size: 102337
dataset_size: 102203.0
---
# Dataset Card for "flipkart-scraped-dresses-10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Traditional_Chinese_Oral_Message_Data | ---
task_categories:
- conversational
language:
- zh
---
# Dataset Card for Nexdata/Traditional_Chinese_Oral_Message_Data
## Description
A Traditional Chinese SMS corpus of 10 million messages of real spoken-style Traditional Chinese text. It contains text messages only, stored in txt format, and can be used for natural language understanding and related tasks.
For more details, please refer to the link: https://www.nexdata.ai/datasets/182?source=Huggingface
# Specifications
## Data content
Traditional Chinese SMS corpus text data
## Data size
10 million
## Collecting period
The year 2014
## Storage format
txt
## Language
Chinese
# Licensing Information
Commercial License |
medmac01/uemf_cer_chunked | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: page
dtype: int64
- name: ref
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 315387
num_examples: 555
download_size: 151851
dataset_size: 315387
---
# Dataset Card for "uemf_cer_chunked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zeroshot/twitter-financial-news-topic | ---
annotations_creators:
- other
language:
- en
language_creators:
- other
license:
- mit
multilinguality:
- monolingual
pretty_name: twitter financial news
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- twitter
- finance
- markets
- stocks
- wallstreet
- quant
- hedgefunds
- markets
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
### Dataset Description
The Twitter Financial News dataset is an English-language dataset containing an annotated corpus of finance-related tweets. It is used to classify finance-related tweets by topic.
The dataset holds 21,107 documents annotated with 20 labels:
```python
topics = {
"LABEL_0": "Analyst Update",
"LABEL_1": "Fed | Central Banks",
"LABEL_2": "Company | Product News",
"LABEL_3": "Treasuries | Corporate Debt",
"LABEL_4": "Dividend",
"LABEL_5": "Earnings",
"LABEL_6": "Energy | Oil",
"LABEL_7": "Financials",
"LABEL_8": "Currencies",
"LABEL_9": "General News | Opinion",
"LABEL_10": "Gold | Metals | Materials",
"LABEL_11": "IPO",
"LABEL_12": "Legal | Regulation",
"LABEL_13": "M&A | Investments",
"LABEL_14": "Macro",
"LABEL_15": "Markets",
"LABEL_16": "Politics",
"LABEL_17": "Personnel Change",
"LABEL_18": "Stock Commentary",
"LABEL_19": "Stock Movement",
}
```
The data was collected using the Twitter API. The current dataset supports the multi-class classification task.
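A minimal loading sketch (the `text`/`label` column names are an assumption based on typical text-classification cards, not stated above):
```python
from datasets import load_dataset

dataset = load_dataset("zeroshot/twitter-financial-news-topic")
train, validation = dataset["train"], dataset["validation"]

example = train[0]
# Map the integer label to its topic name using the `topics` dict above.
print(example["text"], "->", topics[f"LABEL_{example['label']}"])
```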
### Task: Topic Classification
# Data Splits
There are 2 splits: train and validation. Below are the statistics:
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 16,990 |
| Validation | 4,118 |
# Licensing Information
The Twitter Financial Dataset (topic) version 1.0.0 is released under the MIT License. |
chloecodes/roi-frames | ---
dataset_info:
features:
- name: image
dtype: image
- name: vid_name
dtype: string
- name: videoID
dtype: string
- name: frameID
dtype: string
- name: label
dtype: float64
- name: confidence
dtype: float64
- name: x_topleft
dtype: float64
- name: y_topleft
dtype: float64
- name: x_bottomright
dtype: float64
- name: y_bottomright
dtype: float64
splits:
- name: train
num_bytes: 12378641.2
num_examples: 1150
download_size: 12432041
dataset_size: 12378641.2
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
GopalGoyal/donut_training | ---
license: apache-2.0
---
|
suolyer/pile_dm-mathematics | ---
license: apache-2.0
---
|
datahrvoje/twitter_dataset_1713177099 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24401
num_examples: 55
download_size: 10915
dataset_size: 24401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nikchar/retrieved_claims_test | ---
dataset_info:
features:
- name: label
dtype: string
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: retrieved_evidence
sequence: string
- name: retrieval_score
sequence: float64
- name: id
dtype: string
- name: text
dtype: string
- name: lines
dtype: string
splits:
- name: train
num_bytes: 6050543
num_examples: 1500
download_size: 2972631
dataset_size: 6050543
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieved_claims_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umoubuton/m4singer | ---
license: mit
---
|
orhunc/Bias-Evaluation-Turkish | ---
language:
- tr
---
Translation of the bias evaluation framework of May et al. (2019) from [this repository](https://github.com/W4ngatang/sent-bias) and [this paper](https://arxiv.org/abs/1903.10561) into Turkish. There are 37 tests in total, including tests addressing gender bias as well as tests designed to evaluate ethnic bias toward Kurdish people in the Türkiye context.
Abstract of the paper:
While the growing size of pre-trained language models has led to large improvements in a variety of natural language processing tasks, the success of these models comes with a price: They are trained on drastic amounts of mostly Web-based data, which often contains social stereotypes and biases that the models might pick up. This can have negative consequences, as models can abuse these biases in downstream tasks or applications. An application exemplifying the embedded cultural stereotypes is statistical machine translation, a common natural language processing task. Translations to English from a gender-neutral language such as Turkish, which does not have any grammatical gender like the gendered pronouns 'he' or 'she' in English, lead to gender-stereotyped sentences. For instance, Google Translate converts these Turkish sentences with gender-neutral pronouns: 'O bir doktor. O bir hemşire.' to these English sentences: 'He is a doctor. She is a nurse.' The same behavior can be observed when translating these Turkish sentences into other languages with grammatical gender like Spanish, Russian, and German. The gender-neutral Turkish pronoun 'o' is converted into gender-stereotyped pronouns in the respective language. Mitigating different types of bias in LMs would have diverse implications: On the one hand, it would allow us to avoid amplifying these biases. On the other hand, by avoiding algorithms enforcing social biases against minorities one could shift the social balance in the long term.
Previous research has primarily focused on the English language, especially in the realm of gender bias in language models. However, the investigation of more languages with different linguistic elements than English, especially the ones like Turkish that are grammatically gender-neutral, can deepen our insights into the role of gender bias in LMs. The goal of this thesis was to address this research gap and to investigate the significance of gender-bias in Turkish language models. We used existing bias evaluation frameworks on Turkish models by both translating existing English datasets and creating new ones designed to measure gender-bias in the context of Türkiye. We also extended the testing framework to evaluate Turkish models for their embedded ethnic bias toward Kurdish people. Based on the test outcomes, we suggested possible relations of the picked up biases to different model characteristics such as the model size, their multilingualism, and the training corpora.
zhangshuoming/math_23k_train_numeric_double | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 21505369.00691812
num_examples: 21086
download_size: 2785918
dataset_size: 21505369.00691812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_23k_train_numeric_double"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eduardovco/edu1 | ---
license: openrail
---
|
Davincilee/closure_system_door_inner | ---
license: lgpl-3.0
---
|
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_T_SPECIFIC_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 108332
num_examples: 1880
download_size: 18161
dataset_size: 108332
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_T_SPECIFIC_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigcode/commitpackft | ---
license: mit
pretty_name: CommitPackFT
language:
- code
---

# Dataset Card for CommitPackFT
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/bigcode-project/octopack
- **Paper:** [OctoPack: Instruction Tuning Code Large Language Models](https://arxiv.org/abs/2308.07124)
- **Point of Contact:** [Niklas Muennighoff](mailto:n.muennighoff@gmail.com)
### Dataset Summary
> CommitPackFT is a 2GB filtered version of [CommitPack](https://huggingface.co/datasets/bigcode/commitpack) containing only high-quality commit messages that resemble natural language instructions.
>
- **Creation:** The dataset can be recreated using instructions available [here](https://github.com/bigcode-project/octopack).
- **Languages:** 277
- **OctoPack:**
<table>
<tr>
<th>Data</th>
<td><a href=https://huggingface.co/datasets/bigcode/commitpack>CommitPack</a></td>
<td>4TB of GitHub commits across 350 programming languages</td>
</tr>
<tr>
<th></th>
<td><a href=https://huggingface.co/datasets/bigcode/commitpackft>CommitPackFT</a></td>
<td>Filtered version of CommitPack for high-quality commit messages that resemble instructions</td>
</tr>
<tr>
<th>Model</th>
<td><a href=https://huggingface.co/bigcode/octocoder>OctoCoder</a></td>
<td>StarCoder (16B parameters) instruction tuned on CommitPackFT + OASST</td>
</tr>
<tr>
<th></th>
<td><a href=https://huggingface.co/bigcode/octogeex>OctoGeeX</a></td>
<td>CodeGeeX2 (6B parameters) instruction tuned on CommitPackFT + OASST</td>
</tr>
<tr>
<th>Evaluation </th>
<td><a href=https://huggingface.co/datasets/bigcode/humanevalpack>HumanEvalPack</a></td>
<td>Extension of OpenAI's HumanEval to cover 3 scenarios across 6 languages</td>
</tr>
</table>
## Dataset Structure
### Data Instances
An example looks as follows:
```python
{
'commit': '0c17311f7fd511f5dae8f8e4acc2dce1a2de3cf5',
'old_file': 'main.py',
'new_file': 'main.py',
'old_contents': "import numpy as np\nimport matplotlib.pyplot as plt\n\n# generate sample data\nx_data = np.linspace(-5, 5, 20)\ny_data = np.random.normal(0.0, 1.0, x_data.size)\n\nplt.plot(x_data, y_data, 'o')\nplt.show()\n",
'new_contents': "import math\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# generate sample data\nx_data = np.linspace(-math.pi, math.pi, 30)\ny_data = np.sin(x_data) + np.random.normal(0.0, 0.1, x_data.size)\n\nplt.plot(x_data, y_data, 'o')\nplt.show()\n\n",
'subject': 'Change to sin() function with noise',
'message': 'Change to sin() function with noise\n',
'lang': 'Python',
'license': 'mit',
'repos': 'MorganR/basic-gaussian-process'
}
```
### Data Fields
The data fields are the same among all splits:
- `commit`: unique commit id
- `old_file`: name of the file before the commit
- `new_file`: name of the file after the commit
- `old_contents`: contents of the file before the commit
- `new_contents`: contents of the file after the commit
- `subject`: subject of the commit (this is used for all experiments in the paper)
- `message`: message of the commit (commonly the same as the subject)
- `lang`: programming language
- `license`: license of the repository the code stems from, one of `['mit', 'artistic-2.0', 'isc', 'cc0-1.0', 'epl-1.0', 'mpl-2.0', 'unlicense', 'unknown', 'apache-2.0', 'bsd-3-clause', 'agpl-3.0', 'lgpl-2.1', 'bsd-2-clause']`
- `repos`: name of the repository the code stems from (if multiple, they are comma separated)
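A minimal sketch of loading one language subset and reading these fields; the language-named config (here `"python"`) is an assumption based on the per-language breakdown below:
```python
from datasets import load_dataset

# Load one language subset (config name assumed to match the language names below).
ds = load_dataset("bigcode/commitpackft", "python", split="train")

sample = ds[0]
instruction = sample["subject"]        # commit subject (used for the paper's experiments)
before_code = sample["old_contents"]   # file contents before the commit
after_code = sample["new_contents"]    # file contents after the commit
print(instruction)
```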
### Data Splits
| Name | Megabytes | % of total | Samples | % of total |
| --- | --- | --- | --- | --- |
| total | 1545.02 | 100.0% | 702062 | 100.0% |
| ruby | 195.292 | 12.6401% | 69413 | 9.887% |
| yaml | 190.876 | 12.3543% | 114320 | 16.2835% |
| python | 132.68 | 8.5876% | 56025 | 7.9801% |
| markdown | 131.152 | 8.4887% | 62518 | 8.9049% |
| javascript | 125.008 | 8.091% | 52989 | 7.5476% |
| json | 86.744 | 5.6144% | 39777 | 5.6657% |
| shell | 66.864 | 4.3277% | 31217 | 4.4465% |
| text | 66.664 | 4.3148% | 46588 | 6.6359% |
| php | 60.22 | 3.8977% | 24791 | 3.5312% |
| java | 56.284 | 3.6429% | 20635 | 2.9392% |
| html | 48.42 | 3.1339% | 20214 | 2.8792% |
| c# | 26.84 | 1.7372% | 9346 | 1.3312% |
| xml | 23.676 | 1.5324% | 9337 | 1.3299% |
| html+erb | 23.104 | 1.4954% | 10910 | 1.554% |
| c | 21.08 | 1.3644% | 8506 | 1.2116% |
| ini | 21.04 | 1.3618% | 11360 | 1.6181% |
| coffeescript | 16.96 | 1.0977% | 5513 | 0.7853% |
| swift | 16.272 | 1.0532% | 4849 | 0.6907% |
| restructuredtext | 15.728 | 1.018% | 6560 | 0.9344% |
| typescript | 14.284 | 0.9245% | 5868 | 0.8358% |
| c++ | 14.136 | 0.9149% | 4992 | 0.711% |
| scss | 13.208 | 0.8549% | 6829 | 0.9727% |
| go | 12.132 | 0.7852% | 5004 | 0.7128% |
| scala | 11.184 | 0.7239% | 5040 | 0.7179% |
| haml | 10.74 | 0.6951% | 4415 | 0.6289% |
| css | 9.364 | 0.6061% | 5049 | 0.7192% |
| rust | 7.244 | 0.4689% | 2996 | 0.4267% |
| toml | 5.584 | 0.3614% | 3424 | 0.4877% |
| jsx | 5.5 | 0.356% | 2199 | 0.3132% |
| kotlin | 5.368 | 0.3474% | 2214 | 0.3154% |
| clojure | 5.068 | 0.328% | 2403 | 0.3423% |
| perl | 4.988 | 0.3228% | 2288 | 0.3259% |
| bitbake | 4.464 | 0.2889% | 1308 | 0.1863% |
| groovy | 4.168 | 0.2698% | 1486 | 0.2117% |
| twig | 3.956 | 0.256% | 1610 | 0.2293% |
| nix | 3.84 | 0.2485% | 1593 | 0.2269% |
| sql | 3.74 | 0.2421% | 2069 | 0.2947% |
| less | 3.724 | 0.241% | 1360 | 0.1937% |
| haskell | 3.308 | 0.2141% | 1389 | 0.1978% |
| handlebars | 3.292 | 0.2131% | 1429 | 0.2035% |
| unknown | 3.048 | 0.1973% | 1597 | 0.2275% |
| batchfile | 2.984 | 0.1931% | 1466 | 0.2088% |
| cucumber | 2.588 | 0.1675% | 976 | 0.139% |
| makefile | 2.528 | 0.1636% | 960 | 0.1367% |
| elixir | 2.348 | 0.152% | 1150 | 0.1638% |
| jade | 2.348 | 0.152% | 1119 | 0.1594% |
| cmake | 2.268 | 0.1468% | 981 | 0.1397% |
| powershell | 2.064 | 0.1336% | 991 | 0.1412% |
| slim | 2.056 | 0.1331% | 1052 | 0.1498% |
| emacs-lisp | 1.972 | 0.1276% | 1015 | 0.1446% |
| dart | 1.96 | 0.1269% | 765 | 0.109% |
| viml | 1.956 | 0.1266% | 1063 | 0.1514% |
| asciidoc | 1.864 | 0.1206% | 523 | 0.0745% |
| lua | 1.852 | 0.1199% | 920 | 0.131% |
| llvm | 1.6 | 0.1036% | 780 | 0.1111% |
| smarty | 1.588 | 0.1028% | 737 | 0.105% |
| diff | 1.48 | 0.0958% | 680 | 0.0969% |
| common-lisp | 1.448 | 0.0937% | 778 | 0.1108% |
| saltstack | 1.412 | 0.0914% | 617 | 0.0879% |
| vue | 1.384 | 0.0896% | 587 | 0.0836% |
| sass | 1.364 | 0.0883% | 705 | 0.1004% |
| fish | 1.328 | 0.086% | 813 | 0.1158% |
| erlang | 1.192 | 0.0772% | 480 | 0.0684% |
| freemarker | 1.028 | 0.0665% | 510 | 0.0726% |
| stylus | 0.948 | 0.0614% | 480 | 0.0684% |
| qml | 0.936 | 0.0606% | 368 | 0.0524% |
| hcl | 0.912 | 0.059% | 421 | 0.06% |
| html+django | 0.848 | 0.0549% | 399 | 0.0568% |
| mako | 0.756 | 0.0489% | 170 | 0.0242% |
| ada | 0.728 | 0.0471% | 265 | 0.0377% |
| ocaml | 0.704 | 0.0456% | 333 | 0.0474% |
| f# | 0.656 | 0.0425% | 254 | 0.0362% |
| elm | 0.62 | 0.0401% | 265 | 0.0377% |
| tex | 0.564 | 0.0365% | 307 | 0.0437% |
| rdoc | 0.552 | 0.0357% | 270 | 0.0385% |
| csv | 0.532 | 0.0344% | 375 | 0.0534% |
| protocol-buffer | 0.524 | 0.0339% | 181 | 0.0258% |
| smalltalk | 0.46 | 0.0298% | 284 | 0.0405% |
| arduino | 0.456 | 0.0295% | 225 | 0.032% |
| java-server-pages | 0.452 | 0.0293% | 173 | 0.0246% |
| scheme | 0.42 | 0.0272% | 213 | 0.0303% |
| groff | 0.396 | 0.0256% | 192 | 0.0273% |
| objective-c++ | 0.376 | 0.0243% | 86 | 0.0122% |
| desktop | 0.364 | 0.0236% | 186 | 0.0265% |
| factor | 0.356 | 0.023% | 113 | 0.0161% |
| crystal | 0.348 | 0.0225% | 182 | 0.0259% |
| rhtml | 0.348 | 0.0225% | 135 | 0.0192% |
| haxe | 0.344 | 0.0223% | 174 | 0.0248% |
| glsl | 0.34 | 0.022% | 164 | 0.0234% |
| gas | 0.336 | 0.0217% | 193 | 0.0275% |
| html+php | 0.332 | 0.0215% | 150 | 0.0214% |
| qmake | 0.32 | 0.0207% | 140 | 0.0199% |
| julia | 0.312 | 0.0202% | 180 | 0.0256% |
| cython | 0.308 | 0.0199% | 123 | 0.0175% |
| html+eex | 0.292 | 0.0189% | 135 | 0.0192% |
| tcl | 0.292 | 0.0189% | 103 | 0.0147% |
| org | 0.272 | 0.0176% | 136 | 0.0194% |
| perl6 | 0.268 | 0.0173% | 122 | 0.0174% |
| m4 | 0.264 | 0.0171% | 101 | 0.0144% |
| xslt | 0.256 | 0.0166% | 99 | 0.0141% |
| svg | 0.252 | 0.0163% | 169 | 0.0241% |
| nimrod | 0.236 | 0.0153% | 67 | 0.0095% |
| r | 0.228 | 0.0148% | 121 | 0.0172% |
| robotframework | 0.212 | 0.0137% | 85 | 0.0121% |
| racket | 0.196 | 0.0127% | 117 | 0.0167% |
| textile | 0.184 | 0.0119% | 61 | 0.0087% |
| assembly | 0.172 | 0.0111% | 105 | 0.015% |
| purescript | 0.172 | 0.0111% | 80 | 0.0114% |
| unity3d-asset | 0.156 | 0.0101% | 101 | 0.0144% |
| visual-basic | 0.152 | 0.0098% | 48 | 0.0068% |
| dm | 0.148 | 0.0096% | 16 | 0.0023% |
| pod | 0.148 | 0.0096% | 54 | 0.0077% |
| standard-ml | 0.148 | 0.0096% | 72 | 0.0103% |
| fortran | 0.144 | 0.0093% | 70 | 0.01% |
| gettext-catalog | 0.132 | 0.0085% | 72 | 0.0103% |
| idris | 0.132 | 0.0085% | 38 | 0.0054% |
| livescript | 0.128 | 0.0083% | 63 | 0.009% |
| xtend | 0.128 | 0.0083% | 55 | 0.0078% |
| actionscript | 0.12 | 0.0078% | 49 | 0.007% |
| vala | 0.116 | 0.0075% | 50 | 0.0071% |
| awk | 0.104 | 0.0067% | 52 | 0.0074% |
| ceylon | 0.1 | 0.0065% | 49 | 0.007% |
| jupyter-notebook | 0.1 | 0.0065% | 48 | 0.0068% |
| dockerfile | 0.096 | 0.0062% | 39 | 0.0056% |
| rouge | 0.096 | 0.0062% | 41 | 0.0058% |
| asp | 0.092 | 0.006% | 22 | 0.0031% |
| sqf | 0.092 | 0.006% | 45 | 0.0064% |
| edn | 0.088 | 0.0057% | 48 | 0.0068% |
| liquid | 0.088 | 0.0057% | 30 | 0.0043% |
| xquery | 0.084 | 0.0054% | 39 | 0.0056% |
| linker-script | 0.08 | 0.0052% | 37 | 0.0053% |
| mediawiki | 0.08 | 0.0052% | 33 | 0.0047% |
| parrot-internal-representation | 0.08 | 0.0052% | 23 | 0.0033% |
| solidity | 0.08 | 0.0052% | 37 | 0.0053% |
| json5 | 0.076 | 0.0049% | 33 | 0.0047% |
| systemverilog | 0.076 | 0.0049% | 35 | 0.005% |
| thrift | 0.076 | 0.0049% | 28 | 0.004% |
| groovy-server-pages | 0.072 | 0.0047% | 25 | 0.0036% |
| processing | 0.072 | 0.0047% | 35 | 0.005% |
| cuda | 0.068 | 0.0044% | 25 | 0.0036% |
| graphviz-dot | 0.068 | 0.0044% | 35 | 0.005% |
| inno-setup | 0.064 | 0.0041% | 16 | 0.0023% |
| api-blueprint | 0.06 | 0.0039% | 23 | 0.0033% |
| nsis | 0.06 | 0.0039% | 15 | 0.0021% |
| gentoo-ebuild | 0.056 | 0.0036% | 16 | 0.0023% |
| logtalk | 0.056 | 0.0036% | 21 | 0.003% |
| jasmin | 0.052 | 0.0034% | 9 | 0.0013% |
| literate-coffeescript | 0.052 | 0.0034% | 19 | 0.0027% |
| webidl | 0.052 | 0.0034% | 6 | 0.0009% |
| coldfusion-cfc | 0.048 | 0.0031% | 20 | 0.0028% |
| opencl | 0.048 | 0.0031% | 23 | 0.0033% |
| openscad | 0.048 | 0.0031% | 21 | 0.003% |
| pan | 0.048 | 0.0031% | 23 | 0.0033% |
| pascal | 0.048 | 0.0031% | 25 | 0.0036% |
| pony | 0.048 | 0.0031% | 16 | 0.0023% |
| turtle | 0.048 | 0.0031% | 21 | 0.003% |
| chapel | 0.044 | 0.0028% | 20 | 0.0028% |
| ioke | 0.044 | 0.0028% | 25 | 0.0036% |
| ooc | 0.044 | 0.0028% | 15 | 0.0021% |
| sparql | 0.044 | 0.0028% | 23 | 0.0033% |
| applescript | 0.04 | 0.0026% | 19 | 0.0027% |
| augeas | 0.04 | 0.0026% | 13 | 0.0019% |
| g-code | 0.04 | 0.0026% | 7 | 0.001% |
| mirah | 0.04 | 0.0026% | 16 | 0.0023% |
| capn-proto | 0.036 | 0.0023% | 12 | 0.0017% |
| digital-command-language | 0.036 | 0.0023% | 19 | 0.0027% |
| hy | 0.036 | 0.0023% | 12 | 0.0017% |
| logos | 0.036 | 0.0023% | 19 | 0.0027% |
| modelica | 0.036 | 0.0023% | 15 | 0.0021% |
| vcl | 0.036 | 0.0023% | 18 | 0.0026% |
| antlr | 0.032 | 0.0021% | 15 | 0.0021% |
| gdscript | 0.032 | 0.0021% | 9 | 0.0013% |
| graphql | 0.032 | 0.0021% | 17 | 0.0024% |
| hlsl | 0.032 | 0.0021% | 11 | 0.0016% |
| gnuplot | 0.028 | 0.0018% | 17 | 0.0024% |
| http | 0.028 | 0.0018% | 19 | 0.0027% |
| ninja | 0.028 | 0.0018% | 14 | 0.002% |
| oz | 0.028 | 0.0018% | 8 | 0.0011% |
| raml | 0.028 | 0.0018% | 9 | 0.0013% |
| aspectj | 0.024 | 0.0016% | 8 | 0.0011% |
| autohotkey | 0.024 | 0.0016% | 15 | 0.0021% |
| fancy | 0.024 | 0.0016% | 8 | 0.0011% |
| moonscript | 0.024 | 0.0016% | 10 | 0.0014% |
| piglatin | 0.024 | 0.0016% | 11 | 0.0016% |
| stata | 0.024 | 0.0016% | 10 | 0.0014% |
| urweb | 0.024 | 0.0016% | 6 | 0.0009% |
| xs | 0.024 | 0.0016% | 7 | 0.001% |
| yang | 0.024 | 0.0016% | 6 | 0.0009% |
| agda | 0.02 | 0.0013% | 10 | 0.0014% |
| coldfusion | 0.02 | 0.0013% | 9 | 0.0013% |
| emberscript | 0.02 | 0.0013% | 7 | 0.001% |
| latte | 0.02 | 0.0013% | 7 | 0.001% |
| literate-haskell | 0.02 | 0.0013% | 7 | 0.001% |
| postscript | 0.02 | 0.0013% | 9 | 0.0013% |
| scilab | 0.02 | 0.0013% | 10 | 0.0014% |
| tcsh | 0.02 | 0.0013% | 10 | 0.0014% |
| volt | 0.02 | 0.0013% | 9 | 0.0013% |
| apl | 0.016 | 0.001% | 7 | 0.001% |
| genshi | 0.016 | 0.001% | 3 | 0.0004% |
| jsonld | 0.016 | 0.001% | 6 | 0.0009% |
| krl | 0.016 | 0.001% | 4 | 0.0006% |
| lean | 0.016 | 0.001% | 3 | 0.0004% |
| lfe | 0.016 | 0.001% | 6 | 0.0009% |
| metal | 0.016 | 0.001% | 4 | 0.0006% |
| monkey | 0.016 | 0.001% | 4 | 0.0006% |
| mupad | 0.016 | 0.001% | 4 | 0.0006% |
| nesc | 0.016 | 0.001% | 7 | 0.001% |
| nit | 0.016 | 0.001% | 3 | 0.0004% |
| pike | 0.016 | 0.001% | 6 | 0.0009% |
| purebasic | 0.016 | 0.001% | 5 | 0.0007% |
| renpy | 0.016 | 0.001% | 3 | 0.0004% |
| vhdl | 0.016 | 0.001% | 5 | 0.0007% |
| xproc | 0.016 | 0.001% | 3 | 0.0004% |
| zephir | 0.016 | 0.001% | 4 | 0.0006% |
| apacheconf | 0.012 | 0.0008% | 2 | 0.0003% |
| boo | 0.012 | 0.0008% | 2 | 0.0003% |
| brainfuck | 0.012 | 0.0008% | 2 | 0.0003% |
| bro | 0.012 | 0.0008% | 3 | 0.0004% |
| cartocss | 0.012 | 0.0008% | 3 | 0.0004% |
| creole | 0.012 | 0.0008% | 2 | 0.0003% |
| csound | 0.012 | 0.0008% | 4 | 0.0006% |
| dylan | 0.012 | 0.0008% | 2 | 0.0003% |
| eagle | 0.012 | 0.0008% | 4 | 0.0006% |
| ecl | 0.012 | 0.0008% | 4 | 0.0006% |
| eiffel | 0.012 | 0.0008% | 2 | 0.0003% |
| flux | 0.012 | 0.0008% | 3 | 0.0004% |
| io | 0.012 | 0.0008% | 4 | 0.0006% |
| jsoniq | 0.012 | 0.0008% | 6 | 0.0009% |
| lilypond | 0.012 | 0.0008% | 6 | 0.0009% |
| lsl | 0.012 | 0.0008% | 3 | 0.0004% |
| mask | 0.012 | 0.0008% | 4 | 0.0006% |
| nginx | 0.012 | 0.0008% | 2 | 0.0003% |
| nu | 0.012 | 0.0008% | 2 | 0.0003% |
| pov-ray-sdl | 0.012 | 0.0008% | 5 | 0.0007% |
| ragel-in-ruby-host | 0.012 | 0.0008% | 4 | 0.0006% |
| slash | 0.012 | 0.0008% | 4 | 0.0006% |
| sourcepawn | 0.012 | 0.0008% | 3 | 0.0004% |
| squirrel | 0.012 | 0.0008% | 4 | 0.0006% |
| ston | 0.012 | 0.0008% | 6 | 0.0009% |
| uno | 0.012 | 0.0008% | 2 | 0.0003% |
| wisp | 0.012 | 0.0008% | 3 | 0.0004% |
| xbase | 0.012 | 0.0008% | 3 | 0.0004% |
| yacc | 0.012 | 0.0008% | 3 | 0.0004% |
| zig | 0.012 | 0.0008% | 4 | 0.0006% |
| abap | 0.008 | 0.0005% | 1 | 0.0001% |
| arc | 0.008 | 0.0005% | 2 | 0.0003% |
| ats | 0.008 | 0.0005% | 3 | 0.0004% |
| blitzmax | 0.008 | 0.0005% | 1 | 0.0001% |
| bluespec | 0.008 | 0.0005% | 2 | 0.0003% |
| c2hs-haskell | 0.008 | 0.0005% | 2 | 0.0003% |
| clean | 0.008 | 0.0005% | 1 | 0.0001% |
| dns-zone | 0.008 | 0.0005% | 2 | 0.0003% |
| forth | 0.008 | 0.0005% | 2 | 0.0003% |
| harbour | 0.008 | 0.0005% | 1 | 0.0001% |
| igor-pro | 0.008 | 0.0005% | 1 | 0.0001% |
| inform-7 | 0.008 | 0.0005% | 2 | 0.0003% |
| isabelle | 0.008 | 0.0005% | 2 | 0.0003% |
| jflex | 0.008 | 0.0005% | 1 | 0.0001% |
| literate-agda | 0.008 | 0.0005% | 1 | 0.0001% |
| maple | 0.008 | 0.0005% | 2 | 0.0003% |
| mathematica | 0.008 | 0.0005% | 1 | 0.0001% |
| module-management-system | 0.008 | 0.0005% | 1 | 0.0001% |
| mtml | 0.008 | 0.0005% | 2 | 0.0003% |
| netlinx | 0.008 | 0.0005% | 1 | 0.0001% |
| parrot-assembly | 0.008 | 0.0005% | 2 | 0.0003% |
| pawn | 0.008 | 0.0005% | 3 | 0.0004% |
| propeller-spin | 0.008 | 0.0005% | 1 | 0.0001% |
| pure-data | 0.008 | 0.0005% | 1 | 0.0001% |
| rebol | 0.008 | 0.0005% | 3 | 0.0004% |
| red | 0.008 | 0.0005% | 1 | 0.0001% |
| sage | 0.008 | 0.0005% | 1 | 0.0001% |
| sas | 0.008 | 0.0005% | 1 | 0.0001% |
| scaml | 0.008 | 0.0005% | 1 | 0.0001% |
| smt | 0.008 | 0.0005% | 3 | 0.0004% |
| supercollider | 0.008 | 0.0005% | 2 | 0.0003% |
| unrealscript | 0.008 | 0.0005% | 1 | 0.0001% |
| xpages | 0.008 | 0.0005% | 1 | 0.0001% |
## Additional Information
### Licensing Information
Each sample comes from a code repository with a permissive license. The license is provided by the `license` field for each sample.
### Citation Information
```bibtex
@article{muennighoff2023octopack,
title={OctoPack: Instruction Tuning Code Large Language Models},
author={Niklas Muennighoff and Qian Liu and Armel Zebaze and Qinkai Zheng and Binyuan Hui and Terry Yue Zhuo and Swayam Singh and Xiangru Tang and Leandro von Werra and Shayne Longpre},
journal={arXiv preprint arXiv:2308.07124},
year={2023}
}
``` |
collabora/whisperspeech | ---
license: mit
task_categories:
- text-to-speech
language:
- en
pretty_name: WhisperSpeech
---
# The WhisperSpeech Dataset
This dataset contains data to train SPEAR TTS-like text-to-speech models that utilize semantic tokens derived from the OpenAI Whisper
speech recognition model.
We currently provide semantic and acoustic tokens for the LibriLight and LibriTTS datasets (English only).
Acoustic tokens:
- 24kHz EnCodec 6kbps (8 quantizers)
Semantic tokens:
- Whisper tiny VQ bottleneck trained on a subset of LibriLight
Available LibriLight subsets:
- `small`/`medium`/`large` (following the original dataset division but with `large` excluding the speaker `6454`)
- a separate ≈1300hr single-speaker subset based on the `6454` speaker from the `large` subset for training single-speaker TTS models
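To see which token archives are currently available, you can list the repository files before downloading anything (a minimal sketch; the exact file names are not documented here and should be checked first):
```python
from huggingface_hub import hf_hub_download, list_repo_files

files = list_repo_files("collabora/whisperspeech", repo_type="dataset")
print(files[:20])  # inspect the available semantic/acoustic token archives

# Download a single file once you know its name, e.g.:
# path = hf_hub_download("collabora/whisperspeech", filename=files[0], repo_type="dataset")
```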
We plan to add more acoustic tokens from other codecs in the future. |
maghwa/OpenHermes-2-AR-10K-8 | ---
dataset_info:
features:
- name: language
dtype: 'null'
- name: topic
dtype: 'null'
- name: conversations
dtype: string
- name: model
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: views
dtype: float64
- name: id
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: model_name
dtype: 'null'
- name: source
dtype: string
- name: hash
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: category
dtype: 'null'
splits:
- name: train
num_bytes: 19902046
num_examples: 10001
download_size: 8600468
dataset_size: 19902046
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/honda_roko_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of honda_roko (THE iDOLM@STER: Million Live!)
This is the dataset of honda_roko (THE iDOLM@STER: Million Live!), containing 40 images and their tags.
The core tags of this character are `long_hair, bow, yellow_eyes, hair_bow, breasts, grey_hair, twintails, bangs, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 42.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honda_roko_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 28.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honda_roko_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 84 | 54.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honda_roko_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 38.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honda_roko_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 84 | 69.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honda_roko_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/honda_roko_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, open_mouth, navel, :d, nipples, nude, small_breasts, hair_ornament, hat, jewelry, pussy |
| 1 | 5 |  |  |  |  |  | 1girl, 1boy, blush, hetero, penis, solo_focus, sweat, looking_at_viewer, mosaic_censoring, nipples, open_clothes, open_mouth, spread_legs, thighhighs, after_sex, bra, clothed_sex, cum_in_pussy, large_breasts, lying, m_legs, panties, polka_dot, smile, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | open_mouth | navel | :d | nipples | nude | small_breasts | hair_ornament | hat | jewelry | pussy | 1boy | hetero | penis | solo_focus | sweat | mosaic_censoring | open_clothes | spread_legs | thighhighs | after_sex | bra | clothed_sex | cum_in_pussy | large_breasts | lying | m_legs | panties | polka_dot | smile | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------------|:--------|:-----|:----------|:-------|:----------------|:----------------|:------|:----------|:--------|:-------|:---------|:--------|:-------------|:--------|:-------------------|:---------------|:--------------|:-------------|:------------|:------|:--------------|:---------------|:----------------|:--------|:---------|:----------|:------------|:--------|:----------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
eyewashere/testing | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_teknium__OpenHermes-13B | ---
pretty_name: Evaluation run of teknium/OpenHermes-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [teknium/OpenHermes-13B](https://huggingface.co/teknium/OpenHermes-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__OpenHermes-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T20:23:56.851767](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B/blob/main/results_2023-10-24T20-23-56.851767.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003984899328859061,\n\
\ \"em_stderr\": 0.0006451805848102473,\n \"f1\": 0.06597944630872499,\n\
\ \"f1_stderr\": 0.0014689416324005639,\n \"acc\": 0.4352676233998515,\n\
\ \"acc_stderr\": 0.010457879214313065\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003984899328859061,\n \"em_stderr\": 0.0006451805848102473,\n\
\ \"f1\": 0.06597944630872499,\n \"f1_stderr\": 0.0014689416324005639\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \
\ \"acc_stderr\": 0.008820485491442487\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183644\n\
\ }\n}\n```"
repo_url: https://huggingface.co/teknium/OpenHermes-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|arc:challenge|25_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|arc:challenge|25_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T20_23_56.851767
path:
- '**/details_harness|drop|3_2023-10-24T20-23-56.851767.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T20-23-56.851767.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T20_23_56.851767
path:
- '**/details_harness|gsm8k|5_2023-10-24T20-23-56.851767.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T20-23-56.851767.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hellaswag|10_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hellaswag|10_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T01-56-57.835904.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T01-56-57.835904.parquet'
- split: 2023_09_13T02_06_09.559271
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T02-06-09.559271.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T02-06-09.559271.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T20_23_56.851767
path:
- '**/details_harness|winogrande|5_2023-10-24T20-23-56.851767.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T20-23-56.851767.parquet'
- config_name: results
data_files:
- split: 2023_09_13T01_56_57.835904
path:
- results_2023-09-13T01-56-57.835904.parquet
- split: 2023_09_13T02_06_09.559271
path:
- results_2023-09-13T02-06-09.559271.parquet
- split: 2023_10_24T20_23_56.851767
path:
- results_2023-10-24T20-23-56.851767.parquet
- split: latest
path:
- results_2023-10-24T20-23-56.851767.parquet
---
# Dataset Card for Evaluation run of teknium/OpenHermes-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/OpenHermes-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/OpenHermes-13B](https://huggingface.co/teknium/OpenHermes-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-13B",
"harness_winogrande_5",
	split="latest")
```
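The aggregated metrics can be loaded in the same way. The sketch below uses the configuration and split names declared in the metadata above; any of the timestamped split names listed there can be substituted for `latest` to pin a specific run:
```python
from datasets import load_dataset

# Aggregated metrics for this model (the "results" configuration), latest run.
results = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-13B",
                       "results",
                       split="latest")
print(results[0])
```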
## Latest results
These are the [latest results from run 2023-10-24T20:23:56.851767](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B/blob/main/results_2023-10-24T20-23-56.851767.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.003984899328859061,
"em_stderr": 0.0006451805848102473,
"f1": 0.06597944630872499,
"f1_stderr": 0.0014689416324005639,
"acc": 0.4352676233998515,
"acc_stderr": 0.010457879214313065
},
"harness|drop|3": {
"em": 0.003984899328859061,
"em_stderr": 0.0006451805848102473,
"f1": 0.06597944630872499,
"f1_stderr": 0.0014689416324005639
},
"harness|gsm8k|5": {
"acc": 0.11599696739954511,
"acc_stderr": 0.008820485491442487
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183644
}
}
```
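The same figures can also be read straight from the raw results file linked above; a minimal sketch using `huggingface_hub`:
```python
import json
from huggingface_hub import hf_hub_download

# Download the results file from this dataset repository and load it as a dict.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_teknium__OpenHermes-13B",
    repo_type="dataset",
    filename="results_2023-10-24T20-23-56.851767.json",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # top-level keys of the results file
```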
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
daje/en_wiki | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15215273427
num_examples: 5091142
download_size: 8903954435
dataset_size: 15215273427
---
# English Wiki dataset (En_wiki)
* Overview
  - This dataset was built from English Wikipedia data. The original wiki dump was processed with wikiextractor.py and converted to plain text.
  - Its main purpose is to provide a broad body of text that can be used for English natural language processing research and application development.
  - When using dataset map, be sure to pass streaming=True; unless you have a lot of compute and memory, loading everything at once may crash your machine.
* Data structure
  - text: a string containing the body text of a wiki article.
* How to use
1. Using the huggingface datasets library together with map
```python3
from datasets import load_dataset
from transformers import AutoTokenizer

# Any Hugging Face tokenizer can be used here; the checkpoint name is only an example.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# streaming=True keeps the corpus from being loaded into memory all at once.
ko_dataset = load_dataset("daje/en_wiki",
                          split="train",
                          streaming=True)
ko_wiki_tokenized = ko_dataset.map(lambda x : tokenizer(x["text"],
max_length=256,
padding="max_length",
truncation=True),
remove_columns=["text"])
```
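Because the dataset is streamed, `map` is applied lazily and nothing is materialised until you iterate. A quick sanity check over the tokenized stream might look like this (a sketch that reuses the `tokenizer` and `ko_wiki_tokenized` objects from the snippet above):
```python
# Pull one tokenized example from the stream and inspect its first token ids.
first_example = next(iter(ko_wiki_tokenized))
print(first_example["input_ids"][:10])
```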
2. Using a standalone Python script
```
import os
from tqdm import tqdm
from transformers import AutoTokenizer
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('--input_path', type=str)
parser.add_argument('--output_path', type=str)
parser.add_argument('--model_name_or_path', type=str)
parser.add_argument('--max_seq_length', type=int, default=256)
parser.add_argument('--add_sep', default=True, action='store_true')
args = parser.parse_args()
def get_num_lines(fname):
res = os.popen(f'wc -l {fname}').read()
lines = res.strip().split()[0]
return int(lines)
def main(args):
seq_length = args.max_seq_length - 3 # room for [BOS], [EOS], [UNK]
input_fs = open(args.input_path, 'r')
output_fs = open(args.output_path, 'a')
total_line = get_num_lines(args.input_path)
tokenizer = AutoTokenizer.from_pretrained(args.model_name_or_path)
buffer = []
for doc in tqdm(input_fs, total=total_line):
tokens = tokenizer.tokenize(doc)
buffer += tokens
if args.add_sep:
            buffer += [tokenizer.eos_token]  # adjust to your tokenizer: insert its eos or sep token here
while len(buffer) > seq_length:
text = ' '.join(buffer[:seq_length])
output_fs.write(text)
output_fs.write('\n')
buffer = buffer[seq_length:]
input_fs.close()
output_fs.close()
if __name__ == '__main__':
main(args)
```
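The script above is intended to be run from the command line; a hypothetical invocation (file names and model checkpoint are placeholders) would look like `python chunk_corpus.py --input_path en_wiki.txt --output_path en_wiki_256.txt --model_name_or_path bert-base-uncased --max_seq_length 256`.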
|
kings-crown/IsarCodingLearn | ---
license: mit
---
|
thanhduycao/viet_news_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3710939367
num_examples: 1000000
download_size: 1780050101
dataset_size: 3710939367
---
# Dataset Card for "viet_news_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train300_eval100_v1_docidx | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 34574
num_examples: 300
- name: train_recite_qa
num_bytes: 226733
num_examples: 300
- name: eval_qa
num_bytes: 11254
num_examples: 100
- name: eval_recite_qa
num_bytes: 74768
num_examples: 100
- name: all_docs
num_bytes: 254478
num_examples: 392
- name: all_docs_eval
num_bytes: 254451
num_examples: 392
- name: train
num_bytes: 254478
num_examples: 392
- name: validation
num_bytes: 254451
num_examples: 392
download_size: 894547
dataset_size: 1365187
---
# Dataset Card for "lmind_nq_train300_eval100_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B | ---
pretty_name: Evaluation run of Undi95/MLewd-L2-Chat-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/MLewd-L2-Chat-13B](https://huggingface.co/Undi95/MLewd-L2-Chat-13B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-07T04:02:20.497765](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B_public/blob/main/results_2023-11-07T04-02-20.497765.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.039953859060402684,\n\
\ \"em_stderr\": 0.0020056958276819816,\n \"f1\": 0.12528313758389248,\n\
\ \"f1_stderr\": 0.0025138994037981494,\n \"acc\": 0.44361714795535834,\n\
\ \"acc_stderr\": 0.010234482644867801\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.039953859060402684,\n \"em_stderr\": 0.0020056958276819816,\n\
\ \"f1\": 0.12528313758389248,\n \"f1_stderr\": 0.0025138994037981494\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902545\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/MLewd-L2-Chat-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_05T00_36_15.205012
path:
- '**/details_harness|drop|3_2023-11-05T00-36-15.205012.parquet'
- split: 2023_11_07T04_02_20.497765
path:
- '**/details_harness|drop|3_2023-11-07T04-02-20.497765.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-07T04-02-20.497765.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_05T00_36_15.205012
path:
- '**/details_harness|gsm8k|5_2023-11-05T00-36-15.205012.parquet'
- split: 2023_11_07T04_02_20.497765
path:
- '**/details_harness|gsm8k|5_2023-11-07T04-02-20.497765.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-07T04-02-20.497765.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_05T00_36_15.205012
path:
- '**/details_harness|winogrande|5_2023-11-05T00-36-15.205012.parquet'
- split: 2023_11_07T04_02_20.497765
path:
- '**/details_harness|winogrande|5_2023-11-07T04-02-20.497765.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-07T04-02-20.497765.parquet'
- config_name: results
data_files:
- split: 2023_11_05T00_36_15.205012
path:
- results_2023-11-05T00-36-15.205012.parquet
- split: 2023_11_07T04_02_20.497765
path:
- results_2023-11-07T04-02-20.497765.parquet
- split: latest
path:
- results_2023-11-07T04-02-20.497765.parquet
---
# Dataset Card for Evaluation run of Undi95/MLewd-L2-Chat-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewd-L2-Chat-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewd-L2-Chat-13B](https://huggingface.co/Undi95/MLewd-L2-Chat-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B_public",
"harness_winogrande_5",
	split="latest")
```
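The available configurations (listed in the metadata above) can also be discovered programmatically; a small sketch:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B_public"
print(get_dataset_config_names(repo))  # harness_drop_3, harness_gsm8k_5, harness_winogrande_5, results

# Load one of them, pinned to the latest run.
gsm8k = load_dataset(repo, "harness_gsm8k_5", split="latest")
```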
## Latest results
These are the [latest results from run 2023-11-07T04:02:20.497765](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-Chat-13B_public/blob/main/results_2023-11-07T04-02-20.497765.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.039953859060402684,
"em_stderr": 0.0020056958276819816,
"f1": 0.12528313758389248,
"f1_stderr": 0.0025138994037981494,
"acc": 0.44361714795535834,
"acc_stderr": 0.010234482644867801
},
"harness|drop|3": {
"em": 0.039953859060402684,
"em_stderr": 0.0020056958276819816,
"f1": 0.12528313758389248,
"f1_stderr": 0.0025138994037981494
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833055
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902545
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/t_cms_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of t_cms/T-CMS/T-CMS (Girls' Frontline)
This is the dataset of t_cms/T-CMS/T-CMS (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `grey_hair, long_hair, multicolored_hair, streaked_hair, bangs, hair_between_eyes, breasts, purple_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 36.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 14.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 32.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 28.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 56.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/t_cms_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
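The other packages in the table above can be downloaded the same way by changing the `filename` argument of `hf_hub_download` (for example `dataset-800.zip`); note that only the `raw` package carries the meta information.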
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, jacket, fur_trim, goggles_around_neck, coat, off_shoulder, bare_shoulders, black_gloves, black_shorts, open_clothes, holding, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | jacket | fur_trim | goggles_around_neck | coat | off_shoulder | bare_shoulders | black_gloves | black_shorts | open_clothes | holding | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:---------|:-----------|:----------------------|:-------|:---------------|:-----------------|:---------------|:---------------|:---------------|:----------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
metral/ranobe_sample | ---
license: apache-2.0
language:
- ja
---
# What is this?
This is the text of my novel. It has approximately 240,000 words.
The genre is fantasy light novel.
# What is the licence?
The licence type is Apache 2.0.
# How can I use it?
I want you to use this novel as a sample of Japanese writing.
After that, you are free to use it within the scope of the licence.
You can send me fan letters :)
# Are there any precautions I should be aware of?
This text is still available on Kakuyomu, and the unique formatting used for publication there has been retained. Please note that some of this formatting, such as ruby annotations and emphasized characters, is not found in ordinary Japanese text.
* https://kakuyomu.jp/help/entry/notation
# Others.
If you have any questions, please feel free to contact the HuggingFace community. |
tmnam20/ViNLI | ---
dataset_info:
features:
- name: pairID
dtype: string
- name: gold_label
dtype: string
- name: link
dtype: string
- name: context
dtype: string
- name: sentence1
dtype: string
- name: sentenceID
dtype: string
- name: topic
dtype: string
- name: sentence2
dtype: string
- name: annotator_labels
sequence: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 5171024
num_examples: 2048
- name: test
num_bytes: 577699
num_examples: 232
- name: validation
num_bytes: 590037
num_examples: 232
download_size: 436774
dataset_size: 6338760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
hemakumari/new_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 307669488
num_examples: 207372
- name: test
num_bytes: 76915271
num_examples: 51844
download_size: 22761050
dataset_size: 384584759
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jacquelinehe/filtered_trivia_qa | ---
dataset_info:
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: validation
num_bytes: 7804835.411836825
num_examples: 9961
download_size: 4043605
dataset_size: 7804835.411836825
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
autoevaluate/autoeval-eval-lener_br-lener_br-d57983-1886264290 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/xlm-roberta-large-finetuned-lener_br-finetuned-lener-br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/xlm-roberta-large-finetuned-lener_br-finetuned-lener-br
* Dataset: lener_br
* Config: lener_br
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
yijuilee/test_qa2 | ---
license: apache-2.0
---
|
jionghong94/GhostBuster_v3 | ---
license: mit
---
|
mask-distilled-one-sec-cv12/chunk_4 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 917277972
num_examples: 180141
download_size: 933036301
dataset_size: 917277972
---
# Dataset Card for "chunk_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v1](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:57:20.867230](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1/blob/main/results_2023-10-10T14-57-20.867230.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5221924256666464,\n\
\ \"acc_stderr\": 0.03497779761198706,\n \"acc_norm\": 0.5257525929962562,\n\
\ \"acc_norm_stderr\": 0.03496709701060229,\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.5936287801538656,\n\
\ \"mc2_stderr\": 0.015090925037000012\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.01458677635529431\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5657239593706433,\n\
\ \"acc_stderr\": 0.004946485466544624,\n \"acc_norm\": 0.7467635929097789,\n\
\ \"acc_norm_stderr\": 0.0043397644342190655\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n\
\ \"acc_stderr\": 0.028009138125400387,\n \"acc_norm\": 0.5870967741935483,\n\
\ \"acc_norm_stderr\": 0.028009138125400387\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986476,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.0194530666092016,\n \"acc_norm\"\
: 0.710091743119266,\n \"acc_norm_stderr\": 0.0194530666092016\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\
\ \"acc_stderr\": 0.016203792703197793,\n \"acc_norm\": 0.7113665389527458,\n\
\ \"acc_norm_stderr\": 0.016203792703197793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372432,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332694,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332694\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854926,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854926\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.030359697079046104,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.030359697079046104\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.5936287801538656,\n\
\ \"mc2_stderr\": 0.015090925037000012\n }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-57-20.867230.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- results_2023-10-10T14-57-20.867230.parquet
- split: latest
path:
- results_2023-10-10T14-57-20.867230.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v1](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1",
"harness_truthfulqa_mc_0",
split="train")
```
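The same pattern works for any other configuration listed in this card; as a small sketch (assuming the "results" config and its "latest" split behave like the per-task ones above), the aggregated results can be loaded with:
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration; the "latest" split
# points to the most recent run (config and split names taken from this card).
results = load_dataset(
    "open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1",
    "results",
    split="latest",
)
```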
## Latest results
These are the [latest results from run 2023-10-10T14:57:20.867230](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1/blob/main/results_2023-10-10T14-57-20.867230.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5221924256666464,
"acc_stderr": 0.03497779761198706,
"acc_norm": 0.5257525929962562,
"acc_norm_stderr": 0.03496709701060229,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.5936287801538656,
"mc2_stderr": 0.015090925037000012
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5290102389078498,
"acc_norm_stderr": 0.01458677635529431
},
"harness|hellaswag|10": {
"acc": 0.5657239593706433,
"acc_stderr": 0.004946485466544624,
"acc_norm": 0.7467635929097789,
"acc_norm_stderr": 0.0043397644342190655
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.028009138125400387,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.028009138125400387
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986476,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.0194530666092016,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.0194530666092016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.016203792703197793,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.016203792703197793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372432,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.028408302020332694,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.028408302020332694
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854926,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854926
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.030359697079046104,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.030359697079046104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.0202239460050743,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.0202239460050743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.5936287801538656,
"mc2_stderr": 0.015090925037000012
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
samuelstevens/bioclip-demo | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Danjie/UofT-QA | ---
dataset_info:
features:
- name: qa_pair
dtype: string
splits:
- name: train
num_bytes: 172965
num_examples: 1339
download_size: 78486
dataset_size: 172965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pillonneau/test | ---
license: apache-2.0
---
|
Zack157/LG | ---
license: openrail
---
|
Ocelot02/tweet-sentiment-ita-eng | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
- name: text
dtype: string
splits:
- name: train
num_bytes: 199492
num_examples: 1839
- name: validation
num_bytes: 36403
num_examples: 324
- name: test
num_bytes: 97401
num_examples: 870
download_size: 203442
dataset_size: 333296
---
# Dataset Card for "tweet-sentiment-ita-eng"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdrianM0/RedDB | ---
license: mit
---
|
recastai/openassistant-guanaco-chatml | ---
dataset_info:
features:
- name: text
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 31236425.236542758
num_examples: 9829
download_size: 18142328
dataset_size: 31236425.236542758
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
- text2text-generation
---
# Dataset Card for "openassistant-guanaco-chatml "
## Dataset Summary
This dataset has been created by **Re:cast AI** to transform the existing dataset [openassistant-guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) into a [chatml](https://huggingface.co/docs/transformers/main/en/chat_templating)-friendly format for use in SFT tasks with pretrained models.
The following changes have been made:
1. All conversations end in the assistant response.
2. Each example has a 'language' field indicating the language used in that example.
## Dataset Structure
```python
Dataset({
features: ['text', 'messages', 'language'],
num_rows: 9829
})
messages[
{'content': 'Can you write a short introduction about the relevance of... etc.', 'role': 'user'},
{'content': '"Monopsony" refers to a market structure where there is... etc.','role': 'assistant'}
]
```
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("recastai/openassistant-guanaco-chatml", split="train")
```
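Because every row also carries the added `language` field, you can restrict the data to a single language before SFT. A minimal sketch (the exact label values, e.g. `"en"`, are an assumption; inspect the column first):
```python
from collections import Counter

# Inspect which language labels are present and how often
print(Counter(dataset["language"]))

# Keep only one language of interest; the "en" label value is an assumption
en_dataset = dataset.filter(lambda example: example["language"] == "en")
```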
## Modification
Example of applying a custom system message of your choice for chatml training.
```python
INSTRUCTIONS = (
"You are an expert AI assistant that helps users answer questions over a variety of topics. Some rules you always follow\n"
"1. INSERT YOUR RULES HERE"
)
def apply_system_message(example):
example['messages'].insert(0, {'content': INSTRUCTIONS, 'role': 'system'})
return example
dataset = dataset.map(apply_system_message)
``` |
polinaeterna/test_push_dataset_dict_infos_json | ---
dataset_info:
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 3200
num_examples: 200
download_size: 5578
dataset_size: 4800
- config_name: v2
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 3200
num_examples: 200
- name: random
num_bytes: 4800
num_examples: 300
download_size: 0
dataset_size: 8000
configs_kwargs:
- config_name: default
data_dir: ./
- config_name: v2
data_dir: v2
---
# Dataset Card for "test_push_dataset_dict_infos_json"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ThWu/dpo_openhermes | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 557125321
num_examples: 182859
download_size: 289321706
dataset_size: 557125321
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dpo_openhermes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vigneshgs7/Boundary_detection_Doc_8 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 17515986715.0
num_examples: 352
download_size: 1159176262
dataset_size: 17515986715.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BennettYeung/ply-dataset | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-gigaword-default-50c095-2587478720 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- gigaword
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: []
dataset_name: gigaword
dataset_config: default
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: gigaword
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Xiaoci](https://huggingface.co/Xiaoci) for evaluating this model. |
EiffL/DESI | ---
license: mit
dataset_info:
config_name: sv3
features:
- name: TARGETID
dtype: int64
- name: SURVEY
dtype: binary
- name: PROGRAM
dtype: binary
- name: HEALPIX
dtype: int32
- name: TARGET_RA
dtype: float64
- name: TARGET_DEC
dtype: float64
- name: RELEASE
dtype: int16
- name: BRICKID
dtype: int32
- name: BRICK_OBJID
dtype: int32
- name: Z
dtype: float64
- name: EBV
dtype: float32
- name: FLUX_G
dtype: float32
- name: FLUX_R
dtype: float32
- name: FLUX_Z
dtype: float32
- name: FLUX_IVAR_G
dtype: float32
- name: FLUX_IVAR_R
dtype: float32
- name: FLUX_IVAR_Z
dtype: float32
- name: wave
dtype: float32
- name: flux
sequence: float32
length: 7781
- name: ivar
sequence: float32
length: 7781
splits:
- name: train
num_bytes: 72417839557
num_examples: 1126441
download_size: 70656849405
dataset_size: 72417839557
task_categories:
- feature-extraction
size_categories:
- 1M<n<10M
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nastyboget/stackmix_hkr_large | ---
license: mit
task_categories:
- image-to-text
language:
- ru
size_categories:
- 1M<n<10M
---
Dataset generated from the HKR train set using StackMix
=========================================
Number of images: 2476836
Sources:
* [HKR dataset](https://github.com/abdoelsayed2016/HKR_Dataset)
* [Stackmix code](https://github.com/ai-forever/StackMix-OCR)
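No usage example is given above, so here is a minimal inspection sketch; whether streaming works depends on the repository's file layout, and the column names should be checked on the first sample rather than assumed:
```python
from datasets import load_dataset

# Stream the training split instead of downloading ~2.5M generated images up front
ds = load_dataset("nastyboget/stackmix_hkr_large", split="train", streaming=True)

sample = next(iter(ds))
print(sample.keys())  # inspect the available columns (image / transcription fields)
```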
|
zyznull/msmarco-passage-ranking | ---
license: apache-2.0
---
|
sloppysid/call_trans | ---
license: apache-2.0
---
|
AdapterOcean/code_instructions_standardized_cluster_4_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 32006346
num_examples: 20068
download_size: 15837341
dataset_size: 32006346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_4_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
napatswift/budget-seq2seq-json | ---
dataset_info:
features:
- name: line_item
sequence: string
- name: target
dtype: string
- name: input
dtype: string
- name: format
dtype: string
splits:
- name: train
num_bytes: 231359400.0
num_examples: 19075
download_size: 47272901
dataset_size: 231359400.0
---
# Dataset Card for "budget-seq2seq-json"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
efederici/alpaca-vs-alpaca-dpo | ---
language:
- en
size_categories:
- 10K<n<100K
pretty_name: alpaca_vs_alpaca
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 64319355
num_examples: 49194
download_size: 36898348
dataset_size: 64319355
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- dpo
- rlhf
- synthetic
---
# Alpaca vs. Alpaca
<img src="./alpacavsalpaca.jpeg" style="display: block; margin-left: auto; margin-right: auto; width: 30%;">
## Dataset Description
The Alpaca vs. Alpaca dataset is a curated blend of the [Alpaca dataset](https://huggingface.co/datasets/tatsu-lab/alpaca) and the [Alpaca GPT-4 dataset](https://huggingface.co/datasets/vicgalle/alpaca-gpt4), both available on HuggingFace Datasets. It uses the answer from the standard Alpaca dataset as the 'rejected' response, steering the model towards the GPT-4 answer, which is treated as the 'chosen' one.
However, it's important to note that the 'correctness' here is not absolute. The premise rests on the assumption that GPT-4 answers are generally superior in terms of coherence, grammar, and style, and would therefore be preferred in a human evaluation context. This might not always be the case.
The dataset has been filtered to exclude rows referencing GPT-4, rows with identical outputs from both models, and some instances where GPT-4 declined to respond.
The dataset is primarily designed for conversational tasks, to train reward models or apply techniques like DPO.
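As a quick orientation, here is a minimal sketch for loading the dataset and reading one preference pair; the field names (`prompt`, `chosen`, `rejected`) come from the dataset info above, and it is assumed that the assistant turn is the last message in each list:
```python
from datasets import load_dataset

dataset = load_dataset("efederici/alpaca-vs-alpaca-dpo", split="train")

example = dataset[0]
print(example["prompt"])                   # the instruction / prompt
print(example["chosen"][-1]["content"])    # GPT-4 answer, treated as preferred
print(example["rejected"][-1]["content"])  # standard Alpaca answer, treated as rejected
```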
### Citation Information
If you use this dataset in your work, please cite the original Alpaca dataset:
```
@misc{alpaca,
author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto },
title = {Stanford Alpaca: An Instruction-following LLaMA model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
``` |
BangumiBase/katanagatari | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Katanagatari
This is the image base of the bangumi Katanagatari. We detected 22 characters and 2116 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noisy samples.** If you intend to manually train models with this dataset, we recommend performing the necessary preprocessing on the downloaded files to eliminate potential noisy samples (roughly 1% of the images).
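If you plan to do that cleanup locally, a minimal download-and-unpack sketch using `huggingface_hub` (the output directory name is illustrative):
```python
from huggingface_hub import hf_hub_download
import zipfile

# Fetch the full archive (all.zip) from this dataset repository
archive = hf_hub_download(
    repo_id="BangumiBase/katanagatari",
    filename="all.zip",
    repo_type="dataset",
)

# Unpack the images for manual inspection and noise filtering
with zipfile.ZipFile(archive) as zf:
    zf.extractall("katanagatari_images")
```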
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 89 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 32 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 32 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 62 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 17 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 13 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 15 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 21 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 9 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 791 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 60 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 21 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 19 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 586 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 54 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 24 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 19 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 7 | [Download](17/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 18 | 18 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 8 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 64 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 155 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Qdrant/dbpedia-entities-openai3-text-embedding-3-large-1536-100K | ---
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: text-embedding-3-large-1536-embedding
sequence: float64
splits:
- name: train
num_bytes: 1267935009
num_examples: 100000
download_size: 955289024
dataset_size: 1267935009
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Snoopy04/hellaswag-sv-500 | ---
dataset_info:
features:
- name: split
dtype: string
- name: ind
dtype: int64
- name: split_type
dtype: string
- name: ctx_a
dtype: string
- name: ctx
dtype: string
- name: id
dtype: string
- name: label
dtype: string
- name: endings
sequence: string
- name: ctx_b
dtype: string
- name: activity_label
dtype: string
- name: source_id
dtype: string
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 1030308.594606446
num_examples: 500
- name: test
num_bytes: 1030308.594606446
num_examples: 500
download_size: 1215911
dataset_size: 2060617.189212892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
linhqyy/data_aug | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: sentence_annotation
dtype: string
- name: intent
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
splits:
- name: train
num_bytes: 330965
num_examples: 1273
download_size: 95261
dataset_size: 330965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_aug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_61 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30556270281.0
num_examples: 268795
download_size: 30313838733
dataset_size: 30556270281.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
Gargaz/Human-01 | ---
license: apache-2.0
---
|
zolak/twitter_dataset_50_1713126386 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 322834
num_examples: 731
download_size: 159502
dataset_size: 322834
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp | ---
pretty_name: Evaluation run of eren23/NeuralDareBeagle-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/NeuralDareBeagle-7B-slerp](https://huggingface.co/eren23/NeuralDareBeagle-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T18:11:46.511504](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp/blob/main/results_2024-01-28T18-11-46.511504.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554392261924249,\n\
\ \"acc_stderr\": 0.03212679462957801,\n \"acc_norm\": 0.6550602589470452,\n\
\ \"acc_norm_stderr\": 0.032794301399577036,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6918000534624221,\n\
\ \"mc2_stderr\": 0.014976389591941985\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.01344952210993249,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7094204341764588,\n\
\ \"acc_stderr\": 0.004531019159414108,\n \"acc_norm\": 0.8819956184027086,\n\
\ \"acc_norm_stderr\": 0.0032195397905004732\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6918000534624221,\n\
\ \"mc2_stderr\": 0.014976389591941985\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498435\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \
\ \"acc_stderr\": 0.012551285331470152\n }\n}\n```"
repo_url: https://huggingface.co/eren23/NeuralDareBeagle-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|arc:challenge|25_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|gsm8k|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hellaswag|10_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T18-11-46.511504.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- '**/details_harness|winogrande|5_2024-01-28T18-11-46.511504.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T18-11-46.511504.parquet'
- config_name: results
data_files:
- split: 2024_01_28T18_11_46.511504
path:
- results_2024-01-28T18-11-46.511504.parquet
- split: latest
path:
- results_2024-01-28T18-11-46.511504.parquet
---
# Dataset Card for Evaluation run of eren23/NeuralDareBeagle-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/NeuralDareBeagle-7B-slerp](https://huggingface.co/eren23/NeuralDareBeagle-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
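Because the aggregated scores live in the separate "results" configuration described above, they can be pulled the same way; a minimal sketch (the config and split names are taken from the YAML header of this card):
```python
from datasets import load_dataset
# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp",
	"results",
	split="latest")
# A single task can also be loaded at its timestamped split instead of "latest".
virology = load_dataset("open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp",
	"harness_hendrycksTest_virology_5",
	split="2024_01_28T18_11_46.511504")
```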
## Latest results
These are the [latest results from run 2024-01-28T18:11:46.511504](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp/blob/main/results_2024-01-28T18-11-46.511504.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554392261924249,
"acc_stderr": 0.03212679462957801,
"acc_norm": 0.6550602589470452,
"acc_norm_stderr": 0.032794301399577036,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6918000534624221,
"mc2_stderr": 0.014976389591941985
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.01344952210993249,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.013106784883601333
},
"harness|hellaswag|10": {
"acc": 0.7094204341764588,
"acc_stderr": 0.004531019159414108,
"acc_norm": 0.8819956184027086,
"acc_norm_stderr": 0.0032195397905004732
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.6918000534624221,
"mc2_stderr": 0.014976389591941985
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498435
},
"harness|gsm8k|5": {
"acc": 0.7058377558756633,
"acc_stderr": 0.012551285331470152
}
}
```
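The same figures are stored as a plain JSON file in this repository (linked above); if only the raw numbers are needed, one way to fetch them is with `huggingface_hub` (a minimal sketch, assuming the filename shown in the link):
```python
import json
from huggingface_hub import hf_hub_download
# Download the raw results file from the dataset repository.
path = hf_hub_download(
	repo_id="open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp",
	filename="results_2024-01-28T18-11-46.511504.json",
	repo_type="dataset")
with open(path) as f:
	raw = json.load(f)
# The metrics shown above are contained in this file; list the top-level keys to explore it.
print(list(raw.keys()))
```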
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Cornchips1234/Artstyle_test | ---
license: creativeml-openrail-m
task_categories:
- feature-extraction
language:
- en
tags:
- art
pretty_name: snorple
size_categories:
- n<1K
--- |
Jing24/high-train1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 43242286
num_examples: 47599
download_size: 27359389
dataset_size: 43242286
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "high-train1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edgarseverino/andreavocal | ---
license: openrail
---
|
bigscience-data/roots_id_wikibooks | ---
language: id
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_id_wikibooks
# wikibooks_filtered
- Dataset uid: `wikibooks_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0897 % of total
- 0.2591 % of en
- 0.0965 % of fr
- 0.1691 % of es
- 0.2834 % of indic-hi
- 0.2172 % of pt
- 0.0149 % of zh
- 0.0279 % of ar
- 0.1374 % of vi
- 0.5025 % of id
- 0.3694 % of indic-ur
- 0.5744 % of eu
- 0.0769 % of ca
- 0.0519 % of indic-ta
- 0.1470 % of indic-mr
- 0.0751 % of indic-te
- 0.0156 % of indic-bn
- 0.0476 % of indic-ml
- 0.0087 % of indic-pa
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-pa
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
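The step names listed above refer to the BigScience preprocessing pipeline; the real implementation lives in the BigScience data tooling, but conceptually each step is a map or filter pass over documents. A rough, purely illustrative sketch of chaining such passes with the `datasets` library (the helper functions below are hypothetical stand-ins for the actual filters, assuming a `text` column):
```python
from datasets import Dataset
# Toy corpus standing in for the real (gated) ROOTS documents.
docs = Dataset.from_dict({"text": ["first document\nwith a newline", "", "x" * 400]})
def filter_remove_empty_docs(example):
	# Drop documents whose text is empty or whitespace only.
	return len(example["text"].strip()) > 0
def replace_newline_with_space(example):
	example["text"] = example["text"].replace("\n", " ")
	return example
def filter_small_docs_bytes_300(example):
	# Keep only documents of at least 300 UTF-8 bytes.
	return len(example["text"].encode("utf-8")) >= 300
docs = docs.filter(filter_remove_empty_docs)
docs = docs.map(replace_newline_with_space)
docs = docs.filter(filter_small_docs_bytes_300)
print(len(docs))  # 1: only the long document survives
```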
|
weijie210/UC_prefs_iter_0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: critique
dtype: string
- name: post_score
dtype: int64
- name: pre_score
dtype: int64
- name: score_diff
dtype: int64
- name: subsitute
dtype: bool
splits:
- name: train_sft
num_bytes: 90413
num_examples: 16
- name: test_sft
num_bytes: 84859
num_examples: 15
download_size: 135684
dataset_size: 175272
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
open-llm-leaderboard/details_ddobokki__Llama-2-70b-orca-200k | ---
pretty_name: Evaluation run of ddobokki/Llama-2-70b-orca-200k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ddobokki/Llama-2-70b-orca-200k](https://huggingface.co/ddobokki/Llama-2-70b-orca-200k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ddobokki__Llama-2-70b-orca-200k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-09T20:21:28.711089](https://huggingface.co/datasets/open-llm-leaderboard/details_ddobokki__Llama-2-70b-orca-200k/blob/main/results_2023-08-09T20%3A21%3A28.711089.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6675525003199225,\n\
\ \"acc_stderr\": 0.0320256356518761,\n \"acc_norm\": 0.6716808545277895,\n\
\ \"acc_norm_stderr\": 0.03199912887877205,\n \"mc1\": 0.408812729498164,\n\
\ \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5618014117500216,\n\
\ \"mc2_stderr\": 0.015000194909320638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5989761092150171,\n \"acc_stderr\": 0.014322255790719867,\n\
\ \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600935\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6584345747858992,\n\
\ \"acc_stderr\": 0.004732654295724444,\n \"acc_norm\": 0.8525194184425413,\n\
\ \"acc_norm_stderr\": 0.0035385967737048313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594962,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594962\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.0209868545932897,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.0209868545932897\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.02361088430892786,\n \
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.04006485685365343,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.04006485685365343\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168585,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846315,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7488789237668162,\n\
\ \"acc_stderr\": 0.02910522083322462,\n \"acc_norm\": 0.7488789237668162,\n\
\ \"acc_norm_stderr\": 0.02910522083322462\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n\
\ \"acc_stderr\": 0.012426211353093438,\n \"acc_norm\": 0.859514687100894,\n\
\ \"acc_norm_stderr\": 0.012426211353093438\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967558,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967558\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5363128491620112,\n\
\ \"acc_stderr\": 0.016678341894533162,\n \"acc_norm\": 0.5363128491620112,\n\
\ \"acc_norm_stderr\": 0.016678341894533162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7395498392282959,\n\
\ \"acc_stderr\": 0.02492672322484554,\n \"acc_norm\": 0.7395498392282959,\n\
\ \"acc_norm_stderr\": 0.02492672322484554\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451152,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451152\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5084745762711864,\n\
\ \"acc_stderr\": 0.012768401697269048,\n \"acc_norm\": 0.5084745762711864,\n\
\ \"acc_norm_stderr\": 0.012768401697269048\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7075163398692811,\n \"acc_stderr\": 0.018403415710109793,\n \
\ \"acc_norm\": 0.7075163398692811,\n \"acc_norm_stderr\": 0.018403415710109793\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061445,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061445\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n\
\ \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5618014117500216,\n\
\ \"mc2_stderr\": 0.015000194909320638\n }\n}\n```"
repo_url: https://huggingface.co/ddobokki/Llama-2-70b-orca-200k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|arc:challenge|25_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hellaswag|10_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:21:28.711089.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:21:28.711089.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T20:21:28.711089.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T20:21:28.711089.parquet'
- config_name: results
data_files:
- split: 2023_08_09T20_21_28.711089
path:
- results_2023-08-09T20:21:28.711089.parquet
- split: latest
path:
- results_2023-08-09T20:21:28.711089.parquet
---
# Dataset Card for Evaluation run of ddobokki/Llama-2-70b-orca-200k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ddobokki/Llama-2-70b-orca-200k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ddobokki/Llama-2-70b-orca-200k](https://huggingface.co/ddobokki/Llama-2-70b-orca-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ddobokki__Llama-2-70b-orca-200k",
"harness_truthfulqa_mc_0",
split="train")
```
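The same call works for any configuration declared in the YAML header above. As a further sketch (using only the configuration and split names listed there), you can load the aggregated metrics of the run or the per-sample details of a single MMLU sub-task:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_ddobokki__Llama-2-70b-orca-200k"

# Aggregated metrics of the run (the "results" configuration), pinned to the latest eval
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for one MMLU sub-task, also pinned to the latest eval
anatomy_details = load_dataset(REPO, "harness_hendrycksTest_anatomy_5", split="latest")
```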
## Latest results
These are the [latest results from run 2023-08-09T20:21:28.711089](https://huggingface.co/datasets/open-llm-leaderboard/details_ddobokki__Llama-2-70b-orca-200k/blob/main/results_2023-08-09T20%3A21%3A28.711089.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6675525003199225,
"acc_stderr": 0.0320256356518761,
"acc_norm": 0.6716808545277895,
"acc_norm_stderr": 0.03199912887877205,
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5618014117500216,
"mc2_stderr": 0.015000194909320638
},
"harness|arc:challenge|25": {
"acc": 0.5989761092150171,
"acc_stderr": 0.014322255790719867,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600935
},
"harness|hellaswag|10": {
"acc": 0.6584345747858992,
"acc_stderr": 0.004732654295724444,
"acc_norm": 0.8525194184425413,
"acc_norm_stderr": 0.0035385967737048313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594962,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594962
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.0209868545932897,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.0209868545932897
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.04006485685365343,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.04006485685365343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168585,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846315,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7488789237668162,
"acc_stderr": 0.02910522083322462,
"acc_norm": 0.7488789237668162,
"acc_norm_stderr": 0.02910522083322462
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.859514687100894,
"acc_stderr": 0.012426211353093438,
"acc_norm": 0.859514687100894,
"acc_norm_stderr": 0.012426211353093438
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967558,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967558
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5363128491620112,
"acc_stderr": 0.016678341894533162,
"acc_norm": 0.5363128491620112,
"acc_norm_stderr": 0.016678341894533162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7395498392282959,
"acc_stderr": 0.02492672322484554,
"acc_norm": 0.7395498392282959,
"acc_norm_stderr": 0.02492672322484554
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451152,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451152
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5084745762711864,
"acc_stderr": 0.012768401697269048,
"acc_norm": 0.5084745762711864,
"acc_norm_stderr": 0.012768401697269048
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7075163398692811,
"acc_stderr": 0.018403415710109793,
"acc_norm": 0.7075163398692811,
"acc_norm_stderr": 0.018403415710109793
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061445,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061445
},
"harness|truthfulqa:mc|0": {
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5618014117500216,
"mc2_stderr": 0.015000194909320638
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ssbuild/tools_data | ---
license: apache-2.0
---
|
Falah/book_cover_prompts_with_sections | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 393452
num_examples: 1000
download_size: 45494
dataset_size: 393452
---
# Dataset Card for "book_cover_prompts_with_sections"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hezarai/xlsum-fa | ---
task_categories:
- summarization
language:
- fa
pretty_name: XLSum Persian
---
The Persian portion of the [XLSum](https://huggingface.co/datasets/csebuetnlp/xlsum) dataset.
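A minimal loading sketch with the 🤗 Datasets library (the split layout is assumed to mirror the original XLSum release, i.e. train/validation/test):
```python
from datasets import load_dataset

# Assumed split name; adjust if the repository uses a different layout
xlsum_fa = load_dataset("hezarai/xlsum-fa", split="train")

print(xlsum_fa)     # number of rows and column names
print(xlsum_fa[0])  # one article/summary record
```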
### Citation
```bibtex
@inproceedings{hasan-etal-2021-xl,
title = "{XL}-Sum: Large-Scale Multilingual Abstractive Summarization for 44 Languages",
author = "Hasan, Tahmid and
Bhattacharjee, Abhik and
Islam, Md. Saiful and
Mubasshir, Kazi and
Li, Yuan-Fang and
Kang, Yong-Bin and
Rahman, M. Sohel and
Shahriyar, Rifat",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.413",
pages = "4693--4703",
}
``` |
carnival13/rbrt_full_uda_large_ep5 | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1219081708
num_examples: 824810
download_size: 422786339
dataset_size: 1219081708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rbrt_full_uda_large_ep5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CJWeiss/LexGenZero_multitiny | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: input
dtype: string
- name: output
dtype: string
- name: fk_grade
dtype: float64
- name: cluster
dtype: string
- name: old_id
dtype: int64
splits:
- name: train
num_bytes: 109780629
num_examples: 50
download_size: 50133176
dataset_size: 109780629
---
# Dataset Card for "LexGenZero_multitiny"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-32b-v21.2-32k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k](https://huggingface.co/OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-32b-v21.2-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T21:59:46.714800](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-32b-v21.2-32k/blob/main/results_2024-04-15T21-59-46.714800.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7208754115092931,\n\
\ \"acc_stderr\": 0.029652666839758854,\n \"acc_norm\": 0.7328674140340808,\n\
\ \"acc_norm_stderr\": 0.030277861081761843,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314757,\n \"mc2\": 0.5918848188383334,\n\
\ \"mc2_stderr\": 0.014763665308482273\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257182,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6383190599482175,\n\
\ \"acc_stderr\": 0.004795051037917737,\n \"acc_norm\": 0.8323043218482374,\n\
\ \"acc_norm_stderr\": 0.0037283229688748953\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632726,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.026199808807561915,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.026199808807561915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802269,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802269\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7225433526011561,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.7225433526011561,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7191489361702128,\n \"acc_stderr\": 0.029379170464124825,\n\
\ \"acc_norm\": 0.7191489361702128,\n \"acc_norm_stderr\": 0.029379170464124825\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.04644602091222317,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.04644602091222317\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n\
\ \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6455026455026455,\n \"acc_stderr\": 0.024636830602841997,\n \"\
acc_norm\": 0.6455026455026455,\n \"acc_norm_stderr\": 0.024636830602841997\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.864516129032258,\n \"acc_stderr\": 0.019469334586486937,\n \"\
acc_norm\": 0.864516129032258,\n \"acc_norm_stderr\": 0.019469334586486937\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\"\
: 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706473,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706473\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476439,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476439\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7666666666666667,\n \"acc_stderr\": 0.021444547301560476,\n\
\ \"acc_norm\": 0.7666666666666667,\n \"acc_norm_stderr\": 0.021444547301560476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \
\ \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057933,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449655,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449655\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173768,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173768\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6712962962962963,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.6712962962962963,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552104,\n \"\
acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.02068174513588455,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.02068174513588455\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.028568079464714274,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.028568079464714274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.01604626163167314,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.01604626163167314\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8901660280970626,\n\
\ \"acc_stderr\": 0.01118151050324705,\n \"acc_norm\": 0.8901660280970626,\n\
\ \"acc_norm_stderr\": 0.01118151050324705\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252552,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252552\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5642458100558659,\n\
\ \"acc_stderr\": 0.016583881958602397,\n \"acc_norm\": 0.5642458100558659,\n\
\ \"acc_norm_stderr\": 0.016583881958602397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8104575163398693,\n \"acc_stderr\": 0.022442358263336185,\n\
\ \"acc_norm\": 0.8104575163398693,\n \"acc_norm_stderr\": 0.022442358263336185\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.02322275679743509,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.02322275679743509\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.808641975308642,\n \"acc_stderr\": 0.021887704613396158,\n\
\ \"acc_norm\": 0.808641975308642,\n \"acc_norm_stderr\": 0.021887704613396158\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5860495436766623,\n\
\ \"acc_stderr\": 0.012579699631289264,\n \"acc_norm\": 0.5860495436766623,\n\
\ \"acc_norm_stderr\": 0.012579699631289264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294257,\n\
\ \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294257\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7679738562091504,\n \"acc_stderr\": 0.017077373377856933,\n \
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.017077373377856933\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827044,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314757,\n \"mc2\": 0.5918848188383334,\n\
\ \"mc2_stderr\": 0.014763665308482273\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218317\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15466262319939347,\n \
\ \"acc_stderr\": 0.009959786220917198\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-59-46.714800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-59-46.714800.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- '**/details_harness|winogrande|5_2024-04-15T21-59-46.714800.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T21-59-46.714800.parquet'
- config_name: results
data_files:
- split: 2024_04_15T21_59_46.714800
path:
- results_2024-04-15T21-59-46.714800.parquet
- split: latest
path:
- results_2024-04-15T21-59-46.714800.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k](https://huggingface.co/OpenBuddy/openbuddy-qwen1.5-32b-v21.2-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-32b-v21.2-32k",
"harness_winogrande_5",
split="train")
```
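The aggregated metrics live in the "results" configuration listed above; as a small sketch (config and split names are taken from the `configs` section of this card), they can be loaded like this:
```python
from datasets import load_dataset

# Load the aggregated "results" config; the "latest" split always points
# to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-32b-v21.2-32k",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the run
```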
## Latest results
These are the [latest results from run 2024-04-15T21:59:46.714800](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-qwen1.5-32b-v21.2-32k/blob/main/results_2024-04-15T21-59-46.714800.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7208754115092931,
"acc_stderr": 0.029652666839758854,
"acc_norm": 0.7328674140340808,
"acc_norm_stderr": 0.030277861081761843,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314757,
"mc2": 0.5918848188383334,
"mc2_stderr": 0.014763665308482273
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257182,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094094
},
"harness|hellaswag|10": {
"acc": 0.6383190599482175,
"acc_stderr": 0.004795051037917737,
"acc_norm": 0.8323043218482374,
"acc_norm_stderr": 0.0037283229688748953
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.026199808807561915,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.026199808807561915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802269,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802269
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7191489361702128,
"acc_stderr": 0.029379170464124825,
"acc_norm": 0.7191489361702128,
"acc_norm_stderr": 0.029379170464124825
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04644602091222317,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04644602091222317
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6455026455026455,
"acc_stderr": 0.024636830602841997,
"acc_norm": 0.6455026455026455,
"acc_norm_stderr": 0.024636830602841997
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.019469334586486937,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.019469334586486937
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706473,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706473
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476439,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476439
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7666666666666667,
"acc_stderr": 0.021444547301560476,
"acc_norm": 0.7666666666666667,
"acc_norm_stderr": 0.021444547301560476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.024762902678057933,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.024762902678057933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449655,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449655
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173768,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6712962962962963,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.6712962962962963,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552104,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.02068174513588455,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.02068174513588455
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.028568079464714274,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.028568079464714274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.01604626163167314,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.01604626163167314
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8901660280970626,
"acc_stderr": 0.01118151050324705,
"acc_norm": 0.8901660280970626,
"acc_norm_stderr": 0.01118151050324705
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252552,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252552
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5642458100558659,
"acc_stderr": 0.016583881958602397,
"acc_norm": 0.5642458100558659,
"acc_norm_stderr": 0.016583881958602397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8104575163398693,
"acc_stderr": 0.022442358263336185,
"acc_norm": 0.8104575163398693,
"acc_norm_stderr": 0.022442358263336185
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.02322275679743509,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.02322275679743509
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.808641975308642,
"acc_stderr": 0.021887704613396158,
"acc_norm": 0.808641975308642,
"acc_norm_stderr": 0.021887704613396158
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5860495436766623,
"acc_stderr": 0.012579699631289264,
"acc_norm": 0.5860495436766623,
"acc_norm_stderr": 0.012579699631289264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294257,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294257
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.017077373377856933,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.017077373377856933
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827044,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314757,
"mc2": 0.5918848188383334,
"mc2_stderr": 0.014763665308482273
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218317
},
"harness|gsm8k|5": {
"acc": 0.15466262319939347,
"acc_stderr": 0.009959786220917198
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
guidobenb/CVDB_NER | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-ACTION
'2': I-ACTION
'3': B-ACTOR
'4': I-ACTOR
'5': B-ASSETS
'6': I-ASSETS
splits:
- name: train
num_bytes: 1166065.992992993
num_examples: 899
- name: test
num_bytes: 64853.5035035035
num_examples: 50
- name: valid
num_bytes: 64853.5035035035
num_examples: 50
download_size: 224110
dataset_size: 1295773.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
Solshine/SemiSynthetic_Data_For_Regenerative_Farming_Agriculture | ---
license: mit
---
Dataset for Agricultural/Farming methods which increase fertility. The dataset contains scenarios and action suggestions, with intended outcomes. The scenarios are puzzling conundrums on a farm or garden, and the actions are informed by Regenerative Agriculture and Natural Farming principles and practices.
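As a minimal, hedged sketch of how these scenario/action records could be pulled down for experimentation (the repository id comes from this card; the exact column names are not documented here, so the snippet only prints whatever fields the JSON actually exposes):
```python
from datasets import load_dataset

# Sketch: load the JSON records from the Hub and inspect their fields.
# Column names such as "scenario" or "action" are assumptions, so we
# simply print what the dataset really contains.
ds = load_dataset("Solshine/SemiSynthetic_Data_For_Regenerative_Farming_Agriculture")
split = next(iter(ds.values()))   # whichever split the repo provides
print(split.column_names)         # discover the real field names
print(split[0])                   # first scenario/action record
```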
Regarding Regenerative Farming practices and Regenerative Farming:
"What is Regenerative Agriculture?
Regenerative agriculture takes a systems-based, holistic look at the land being stewarded and applies various principles with the goal of making the land more productive and biodiverse over time. In most situations, improving soil health and function is the key to improving productivity and biodiversity. One of the key components of healthy soil is organic matter, which is anything that is alive or was once living, such as a plant root, an earthworm, or a microbe. " -Kiss The Ground Documentary
This curated dataset was created semi-synthetically using a RAG system containing regenerative agriculture data for various plants, sourced from agricultural college public data and extension offices' public data, along with open nutrient projects data, connected to a ChatGPT4 API. It was put together by Copyleft Cultivars Nonprofit, then cleaned lightly by Caleb DeLeeuw (@Solshine on Hugging Face).
This dataset was created and curated in coordination with domain experts in Regenerative Farming and Natural Farming.
The dataset is in json. |
mii-llm/gpt4-eimu-augment | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 5964562
num_examples: 1722
download_size: 3067207
dataset_size: 5964562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gpt4-eimu-augment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/helmet_detection | ---
license: cc-by-nc-nd-4.0
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: bboxes
dtype: string
splits:
- name: train
num_bytes: 56575701
num_examples: 46
download_size: 56584366
dataset_size: 56575701
task_categories:
- image-classification
language:
- en
tags:
- code
---
# Helmet Detection Dataset
The dataset consists of photographs of construction workers at work. It provides helmet detection using bounding boxes and addresses public safety tasks such as ensuring compliance with safety regulations, automating the identification of rule violations, and reducing accidents during construction work.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/helmet-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=helmet_detection) to discuss your requirements, learn about the price and buy the dataset.

# Dataset structure
- **img** - contains the original images of construction workers
- **boxes** - includes bounding box labeling for the original images
- **annotations.xml** - contains the coordinates of the bounding boxes and the labels (helmet, no_helmet) created for the original photos
# Data Format
Each image from the `img` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the bounding boxes and the labels for helmet detection. For each point, the x and y coordinates are provided.
# Example of XML file structure
.png?generation=1686295970420156&alt=media)
# Helmet Detection can be made in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/helmet-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=helmet_detection) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
|
wennnny/wine_review | ---
dataset_info:
features:
- name: wine_id
dtype: int64
- name: country
dtype: string
- name: description
dtype: string
- name: designation
dtype: string
- name: points
dtype: int64
- name: price
dtype: float64
splits:
- name: train
num_bytes: 21093175.17523332
num_examples: 68918
- name: test
num_bytes: 5273446.824766681
num_examples: 17230
download_size: 15117032
dataset_size: 26366622.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Patt/copa_th | ---
language:
- th
- en
license: cc-by-sa-4.0
---
# Dataset Card for copa_th
### Dataset Description
This dataset is a Thai-translated version of [copa](https://huggingface.co/datasets/super_glue/viewer/copa), produced with Google Translate, with the [Multilingual Universal Sentence Encoder](https://arxiv.org/abs/1907.04307) used to calculate a score for each Thai translation.
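A minimal sketch of the scoring idea (not necessarily the authors' exact pipeline): embed the English source and its Thai translation with a multilingual sentence encoder and use their cosine similarity as the translation score. The `sentence-transformers` model below is an illustrative stand-in for the Multilingual Universal Sentence Encoder.
```python
from sentence_transformers import SentenceTransformer, util

# Illustrative stand-in for the Multilingual Universal Sentence Encoder;
# any multilingual sentence-embedding model could be substituted here.
model = SentenceTransformer("distiluse-base-multilingual-cased-v2")

en = "The man is eating rice."
th = "ผู้ชายกำลังกินข้าว"  # candidate machine translation to be scored

emb = model.encode([en, th], convert_to_tensor=True)
score = util.cos_sim(emb[0], emb[1]).item()  # higher = closer in meaning
print(f"translation score: {score:.3f}")
```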
### Languages
- EN
- TH
|
Code-Refinement/utf_20_refs_file_sample100 | ---
dataset_info:
features:
- name: problem_id
dtype: int64
- name: question
dtype: string
- name: solutions
dtype: string
- name: input_output
struct:
- name: inputs
sequence: string
- name: outputs
sequence: string
- name: difficulty
dtype: string
- name: url
dtype: string
- name: starter_code
dtype: string
- name: is_call_based
dtype: bool
- name: code_initial
dtype: string
- name: feedback_initial
dtype: string
- name: r_initial
dtype: float64
- name: sol_idx
dtype: int64
- name: chosen_ref_id
dtype: int64
- name: chosen_refinement
dtype: string
- name: chosen_reward
dtype: float64
- name: rejected_ref_id
dtype: int64
- name: rejected_refinement
dtype: string
- name: rejected_reward
dtype: float64
- name: branch_weight
dtype: float64
splits:
- name: train
num_bytes: 3872897
num_examples: 100
- name: test
num_bytes: 935660
num_examples: 100
download_size: 679929
dataset_size: 4808557
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
zolak/twitter_dataset_81_1713216143 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1399702
num_examples: 3458
download_size: 708400
dataset_size: 1399702
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Kukedlc__Ramakrishna-7b-v3 | ---
pretty_name: Evaluation run of Kukedlc/Ramakrishna-7b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Ramakrishna-7b-v3](https://huggingface.co/Kukedlc/Ramakrishna-7b-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Ramakrishna-7b-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:08:57.812290](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Ramakrishna-7b-v3/blob/main/results_2024-03-29T20-08-57.812290.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651989434078567,\n\
\ \"acc_stderr\": 0.032058983881475295,\n \"acc_norm\": 0.6513495338928594,\n\
\ \"acc_norm_stderr\": 0.032728855858660366,\n \"mc1\": 0.6156670746634026,\n\
\ \"mc1_stderr\": 0.017028707301245217,\n \"mc2\": 0.7666543089649267,\n\
\ \"mc2_stderr\": 0.013927924378838195\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7064846416382252,\n \"acc_stderr\": 0.013307250444941117,\n\
\ \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.012875929151297044\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7149970125473013,\n\
\ \"acc_stderr\": 0.0045049329997364105,\n \"acc_norm\": 0.8899621589324835,\n\
\ \"acc_norm_stderr\": 0.003122973632039471\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464086,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464086\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6156670746634026,\n\
\ \"mc1_stderr\": 0.017028707301245217,\n \"mc2\": 0.7666543089649267,\n\
\ \"mc2_stderr\": 0.013927924378838195\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7020470053070508,\n \
\ \"acc_stderr\": 0.012597932232914522\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Ramakrishna-7b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-08-57.812290.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-08-57.812290.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- '**/details_harness|winogrande|5_2024-03-29T20-08-57.812290.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-08-57.812290.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_08_57.812290
path:
- results_2024-03-29T20-08-57.812290.parquet
- split: latest
path:
- results_2024-03-29T20-08-57.812290.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Ramakrishna-7b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Ramakrishna-7b-v3](https://huggingface.co/Kukedlc/Ramakrishna-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Ramakrishna-7b-v3",
"harness_winogrande_5",
split="train")
```
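The aggregated metrics for the run can be loaded the same way through the `results` configuration declared in the frontmatter above (a sketch using the same `load_dataset` call; the `latest` split name is taken from that frontmatter):
```python
from datasets import load_dataset

# Aggregated results of the run; the "latest" split points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Kukedlc__Ramakrishna-7b-v3",
                       "results",
                       split="latest")
```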
## Latest results
These are the [latest results from run 2024-03-29T20:08:57.812290](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Ramakrishna-7b-v3/blob/main/results_2024-03-29T20-08-57.812290.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.651989434078567,
"acc_stderr": 0.032058983881475295,
"acc_norm": 0.6513495338928594,
"acc_norm_stderr": 0.032728855858660366,
"mc1": 0.6156670746634026,
"mc1_stderr": 0.017028707301245217,
"mc2": 0.7666543089649267,
"mc2_stderr": 0.013927924378838195
},
"harness|arc:challenge|25": {
"acc": 0.7064846416382252,
"acc_stderr": 0.013307250444941117,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.012875929151297044
},
"harness|hellaswag|10": {
"acc": 0.7149970125473013,
"acc_stderr": 0.0045049329997364105,
"acc_norm": 0.8899621589324835,
"acc_norm_stderr": 0.003122973632039471
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464086,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464086
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6156670746634026,
"mc1_stderr": 0.017028707301245217,
"mc2": 0.7666543089649267,
"mc2_stderr": 0.013927924378838195
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775778
},
"harness|gsm8k|5": {
"acc": 0.7020470053070508,
"acc_stderr": 0.012597932232914522
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
indonlp/nusatranslation_emot | ---
license: apache-2.0
---
|
smrynrz20/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
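For illustration, a minimal loading sketch based on the front matter above. The snippet is not part of the original card; it assumes only the declared default config with its single `train` split:

```python
from datasets import load_dataset

# A minimal loading sketch (not part of the original card). The default config
# declares a single "train" split with "instruction" and "output" string fields.
ds = load_dataset("smrynrz20/mini-platypus-two", split="train")

print(ds)            # expected: a Dataset with 1000 rows
print(ds[0].keys())  # expected: dict_keys(['instruction', 'output'])
```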
|